Method for social interaction with a robot.

Date of publication: 28-12-2018
Number: CH0000713934A2
Application number: 7762018
Application date: 07-10-2017

Description

General state of the art

[0001] Human tasks in personal care are increasingly being taken over by autonomous care robots that help to meet the needs of daily life in hospital or home care. This applies particularly to the care of persons with mental or cognitive impairments or diseases, e.g. dementia. Care robots are equipped with devices for collecting information about the person in need of care and the care environment, i.e. sensors such as microphones and cameras, or smart devices belonging to the Internet of Things, and with means for carrying out measures, i.e. devices for gripping, moving, and communicating. Human-robot interaction is achieved by intelligent functions such as speech recognition or the recognition of facial expressions or tactile patterns. These functions may also be imitated by the robot as the situation requires, e.g. by generating speech or gestures or by producing emotional feedback.

[0002] A challenge for robot-assisted care is to identify the actual needs of the person in need of care and of the care environment and to carry out the corresponding measures. Needs of the person are, for example, hunger, thirst, the desire for rest, or the desire for emotional or social attention. Needs of the care environment are, for example, the need to clear the table, clean up the kitchen, or refill the refrigerator. The corresponding measures are those which satisfy these needs. In general, the needs and measures cannot be determined from the current situation alone, but depend on the history of needs.

Summary of the invention

[0003] The invention concerns a method for social interaction with a robot, wherein the robot comprises a situation manager, which is divided into a situation network for determining needs and an action network for determining the measures to satisfy those needs, a planner for prioritizing measures proposed by the situation manager and optionally by an input device, and a sensor for detecting an event. Both the situation network and the action network are based on probability models. Dividing the situation manager into a situation network and an action network has the effect that the calculation of the measure suitable for a given situation is not based directly on the actual data, but on the calculation of the needs of the given situation.

[0004] Needs of the person in need of care are, for example, hunger, thirst, the desire for rest, or the desire for emotional attention. Needs of the care environment are, for example, clearing the table, cleaning up the kitchen, or refilling the refrigerator.

[0005] Measures to satisfy the needs are, for example, bringing an object to the person or taking it away from the person, emotional speech generation or emotional feedback through image display, clearing the table, or cleaning up the kitchen.

[0006] According to the present invention, the situation manager is divided into a situation network and an action network. The situation network is designed as an artificial neural network for deciding on the situation-related needs, i.e. the needs in a given situation. The situation-related needs are the cumulative needs of the person in need of care and of the care environment measured over time, i.e. the situation-related needs are based on the history of needs.

[0007] The action network is an artificial neural network that derives the appropriate measures for the situation-related needs. Both the situation network and the action network are based on a probability model.

[0008] Dividing the situation manager into a situation network and an action network has the effect that the calculation of the measures suitable for a given situation is not based directly on the actual data, but on the calculation of the needs of the given situation.
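The effect of this division can be sketched as a two-stage pipeline. The following Python sketch is purely illustrative: the class and method names are hypothetical, and simple random linear layers stand in for the trained probability-based networks described in the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class SituationNetwork:
    """Maps sensor/history features to a probability distribution over needs."""
    def __init__(self, n_features, n_needs):
        self.w = rng.normal(size=(n_needs, n_features))
    def needs(self, features):
        return softmax(self.w @ features)   # P(need | situation)

class ActionNetwork:
    """Maps the estimated needs (not the raw data) to a distribution over measures."""
    def __init__(self, n_needs, n_measures):
        self.w = rng.normal(size=(n_measures, n_needs))
    def measures(self, needs):
        return softmax(self.w @ needs)      # P(measure | needs)

class SituationManager:
    """Two-network division: measures are computed from the estimated
    needs, never directly from the raw situation data."""
    def __init__(self, n_features, n_needs, n_measures):
        self.situation_net = SituationNetwork(n_features, n_needs)
        self.action_net = ActionNetwork(n_needs, n_measures)
    def propose(self, features):
        needs = self.situation_net.needs(features)
        return self.action_net.measures(needs)

manager = SituationManager(n_features=8, n_needs=4, n_measures=5)
probs = manager.propose(rng.normal(size=8))
print(probs.shape)  # prints (5,) — a distribution over 5 candidate measures
```

The design point is that `ActionNetwork` never sees the raw features; replacing either stage (e.g. with a trained model) does not affect the other.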

[0009] The situation manager receives inputs from an information pool. The information pool comprises signals from sensors and Internet-of-Things (IoT) devices, a user database, and a history. Sensors according to the present invention are, for example, a microphone, e.g. for detecting speech patterns, a camera, e.g. for detecting facial expression patterns, or a touchpad with tactile sensors, e.g. for detecting tactile patterns of the person. The signals detected by the sensors may be analyzed by speech recognition, facial expression recognition, or recognition of tactile patterns.

[0010] An IoT device is, for example, a refrigerator with sensors to monitor the shelf life of its contents. The user database is a collection of information about the persons in need of care, such as their name, current emotional state, or position in the room. The history contains the historical data of the sensors and IoT channels, but also personal data, for example the history of the emotional state and the history of the actions of the robot. The information pool also has access to the communication channels of the open platform, for example to obtain information on the battery status of the robot.

[0011] Before data from the information pool can be used by the situation manager, they must go through feature preparation. Feature preparation refers to the classification of the analyzed patterns, for example comparing a pattern with the personalized patterns in the user database to derive the emotional state of the person, or detecting temporal developments in the signals from IoT devices.
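Feature preparation of this kind can be illustrated with a simple nearest-neighbour comparison. The pattern vectors and labels below are hypothetical stand-ins for the personalized patterns stored in the user database; a real system would use learned classifiers.

```python
import numpy as np

# Hypothetical personalized patterns from the user database: each emotional
# state is represented by a stored feature vector for this specific person.
user_db = {
    "calm":     np.array([0.1, 0.2, 0.1]),
    "agitated": np.array([0.9, 0.7, 0.8]),
    "sad":      np.array([0.2, 0.8, 0.3]),
}

def classify_pattern(features, personalized_patterns):
    """Nearest-neighbour comparison of an analyzed pattern with the
    personalized patterns; returns the label of the closest stored pattern."""
    return min(personalized_patterns,
               key=lambda label: np.linalg.norm(features - personalized_patterns[label]))

print(classify_pattern(np.array([0.85, 0.75, 0.7]), user_db))  # prints agitated
```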

[0012] In prioritizing the measures, the planner takes into account the decisions of the situation manager and/or the data from input devices such as a user input device, a scheduler, or an emergency controller. A user input device is a device with which the user can directly trigger a measure, for example a button to order a certain care action. The scheduler is a timetable of measures that must be executed regularly, for example serving food or administering medication. The emergency controller is able to detect undesirable or negative events, e.g. signs of rejection of or resistance to the care robot, or a weak battery status. The emergency controller has access to the information pool.
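The scheduler as a timetable of regularly executed measures can be sketched as follows; the timetable entries and the function name are hypothetical examples, not part of the patent.

```python
import datetime

# Hypothetical timetable of measures that must be executed regularly.
timetable = [
    (datetime.time(8, 0),  "serve breakfast"),
    (datetime.time(12, 0), "serve lunch"),
    (datetime.time(18, 0), "administer medication"),
]

def due_measures(now, already_done):
    """Return the scheduled measures whose time has passed and that
    have not been carried out yet."""
    return [m for t, m in timetable if t <= now and m not in already_done]

print(due_measures(datetime.time(12, 30), {"serve breakfast"}))
# prints ['serve lunch']
```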

[0013] The prioritization by the planner has the effect, for example, of continuing the current measure, i.e. it keeps the highest priority, suspending the current measure, i.e. assigning it a lower priority, aborting the current measure, i.e. deleting it from the list of measures, starting a new measure, or resuming a previously interrupted measure.
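These prioritization effects can be modelled with a priority queue. This is a minimal sketch under the assumption that each measure carries a numeric priority (a lower number means a higher priority); the class and method names are hypothetical.

```python
import heapq

class Planner:
    """Minimal priority-queue planner: can start, suspend, and abort measures."""
    def __init__(self):
        self.queue = []          # heap of (priority, name); lowest number wins
    def start(self, name, priority):
        heapq.heappush(self.queue, (priority, name))
    def current(self):
        return self.queue[0][1] if self.queue else None
    def suspend_current(self, new_priority):
        _, name = heapq.heappop(self.queue)      # suspend: assign a lower priority
        heapq.heappush(self.queue, (new_priority, name))
    def abort_current(self):
        heapq.heappop(self.queue)                # abort: delete from the list

planner = Planner()
planner.start("serve food", priority=3)
planner.start("small talk", priority=2)
print(planner.current())        # prints small talk
planner.suspend_current(4)      # small talk drops behind serve food
print(planner.current())        # prints serve food
planner.abort_current()
print(planner.current())        # prints small talk
```

Resuming a previously interrupted measure simply means raising its priority again via the same mechanism.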

[0014] According to the present invention, the method for managing the activities of a robot comprises the following steps:

[0015] Step 1: detecting a signal using a sensor. In this step, a signal or signal pattern relating to the person in need of care or to the care environment is detected. The signals or signal patterns relate, for example, to a position signal, a voice pattern, a facial expression pattern, or a tactile pattern. If the signal pattern relates to a tactile pattern, the sensor is a contact sensor, for example embedded in a touchpad of the robot. If an emotional state pattern is detected using the sensor, the sensor is a microphone for sensing a voice pattern and/or a camera for detecting a visual facial expression pattern.

[0016] Step 2: analyzing the signal. In this step, the detected signal or signal pattern is interpreted, evaluated, or aggregated, for example by extracting features from time series. If the signal pattern relates to a tactile pattern, the detected tactile pattern is interpreted in this step, for example by extracting features from time series. If an emotional state pattern was detected, the detected emotional state pattern is interpreted in this step, for example by extracting features from time series.

[0017] Step 3: classifying the signal. In this step, the analyzed features are classified, for example by comparing the pattern with the personalized patterns in the user database to derive the emotional state of the person, or to recognize temporal developments in the signals from IoT devices. If the signal pattern relates to a tactile pattern, the tactile pattern is classified by means of personalized tactile patterns: the extracted features are classified, for example, by comparing the pattern with the personalized tactile patterns in the user database. If an emotional state pattern was detected, the emotional state pattern is classified by means of personalized emotional state patterns: the extracted features are classified, for example, by comparing the pattern with the personalized emotional state patterns in the user database.

[0018] Step 4: determining the needs of the person and the care environment with the aid of the situation network. In this step, the situation-related needs are calculated from the data of the information pool. The situation network is designed as an artificial neural network based on a probability model. The situation-related needs are the cumulative needs of the person in need of care and of the care environment measured over time. The calculation of the situation-related needs by the artificial neural network is therefore based not only on the actual needs, but also on the history of needs.

[0019] Step 5: determining the measures to satisfy the needs determined by the situation network. In this step, the appropriate measures for the situation-related needs are calculated. The action network is designed as an artificial neural network based on a probability model.

[0020] Step 6: determining the measures triggered by an input device. In this step, the measures triggered by an input device are defined. An input device is, for example, a button to order a certain care action, a scheduler for triggering actions that must be executed regularly, or an emergency controller.

[0021] Step 7: prioritizing the measures by the planner. In this step, the measures are prioritized according to an urgency level, for example from highest to lowest priority: (1) emergency measures, (2) measures ordered by the input device, (3) scheduled measures, (4) measures proposed by the situation manager.

[0022] Step 8: executing the measure with the highest priority. In this step, the most important measure is carried out.

[0023] Step 9: repeating steps (1) to (9) until a stop condition is reached. This step causes the robot to continue its activities until it is stopped by an external stop command.
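Steps 1 to 9 can be summarized as a single control loop. The sketch below is illustrative only: every method name on the `robot` object is a hypothetical placeholder for the components described above, and the priority numbers encode the urgency levels of step 7.

```python
# Urgency levels from step 7; a lower number means a higher priority.
PRIORITY = {"emergency": 0, "input_device": 1, "scheduled": 2, "situation_manager": 3}

def run(robot, stop_requested):
    """One possible control loop over steps 1-9 (method names are hypothetical)."""
    while not stop_requested():                      # step 9: repeat until stopped
        signal = robot.sense()                       # step 1: detect a signal
        features = robot.analyze(signal)             # step 2: extract features
        label = robot.classify(features)             # step 3: compare with user DB
        needs = robot.situation_network(label)       # step 4: situation-related needs
        proposed = robot.action_network(needs)       # step 5: measures for those needs
        triggered = robot.input_device_measures()    # step 6: (priority, measure) pairs
        candidates = triggered + [
            (PRIORITY["situation_manager"], m) for m in proposed]
        candidates.sort(key=lambda pm: pm[0])        # step 7: prioritize by urgency
        if candidates:
            robot.execute(candidates[0][1])          # step 8: highest priority first
```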

[0024] According to one embodiment of the invention, the input device is a user input device and/or a scheduler and/or an emergency controller.

[0025] According to a preferred embodiment of the invention, the situation network and/or the action network is based on a probability model.

[0026] According to an important embodiment of the invention, the situation manager obtains information from an information pool, wherein the information pool draws on a sensor and/or on the Internet of Things and/or on a user database and/or on a history and/or on the communication channels of the open platform.

[0027] According to another embodiment of the invention, the information that the situation manager receives from the information pool is classified by a feature preparation task.

[0028] The invention also relates to a robot for performing the above-described method, wherein the robot comprises a planner for prioritizing measures received from the situation manager and optionally from an input device. The situation manager is divided into a situation network for determining needs and an action network for determining the measures to satisfy the needs.

[0029] According to one embodiment, the input device is a user input device and/or a scheduler and/or an emergency controller.

[0030] According to a preferred embodiment, the situation network and/or the action network is based on a probability model.

[0031] According to an important embodiment, the situation manager obtains information from an information pool, wherein the information pool draws on a sensor and/or on the Internet of Things and/or on a user database and/or on a history and/or on the communication channels of the open platform.

[0032] According to a further embodiment, the information that the situation manager receives from the information pool is classified by a feature preparation task.

[0033] According to a very important embodiment, the sensor has an area of at least 16 mm². This allows the tactile pattern to be measured well by the sensor.

[0034] Finally, the sensor can be embedded in a soft tactile envelope of the robot. This also allows the tactile pattern to be measured well by the sensor.

Brief description of drawings

[0035]

Fig. 1 is a diagram representing the flow of information and the flow of decisions of the robot according to the present invention.

Fig. 2A is a flow diagram showing the workflow of the robot in the monitoring mode.

Fig. 2B is a flow diagram showing the workflow of the robot in the tactile interaction mode.

Fig. 2C is a flow diagram showing the workflow of the robot in the social interaction mode.

[0036] Fig. 1 shows the flow of information and the flow of decisions of the care robot. The core component of the care robot is the planner. The task of the planner is to prioritize measures and to trigger the execution of a specific care measure in a given situation.

[0037] Measures are, for example, changing position, picking up an object, or cleaning up the kitchen. In prioritizing the measures, the planner takes into account the decisions of the situation manager and/or of input devices such as a user input device, a scheduler, or an emergency controller.

[0038] The task of the situation manager is to provide the planner with the measures that satisfy the needs of the person, for example hunger, thirst, or stress reduction, or the needs of the care environment in a given situation. The situation manager responds to requests from the planner. According to the present invention, the situation manager is divided into a situation network and an action network. The situation network is designed as an artificial neural network for deciding on the situation-related needs, i.e. the needs in a given situation. The situation-related needs are the cumulative needs of the person in need of care and of the care environment measured over time, i.e. the situation-related needs are based on the history of needs.

[0039] The action network is an artificial neural network that derives the appropriate measures for the situation-related needs. Both the situation network and the action network are based on a probability model.

[0040] Dividing the situation manager into a situation network and an action network has the effect that the calculation of the measures suitable for a given situation is not based directly on the data of the information pool; instead, it is based on the separate calculation of the needs for the given situation.

[0041] The situation manager receives inputs from an information pool. The information pool comprises information from sensors and IoT devices, a user database, and a history. Sensors according to the present invention are, for example, a microphone, a camera, or a touchpad. An IoT device is, for example, a refrigerator or another smart device. The user database is a collection of information about the persons in need of care, for example their name, current emotional state, or current position in the room. The history contains the historical data of the sensors and IoT channels as well as the history of the actions of the robot and the history of the condition of the persons in need of care. The information pool also has access to the communication channels of the open platform, for example to obtain information on the battery status of the robot.

[0042] Before data from the information pool can be used by the situation manager, they must go through feature preparation. Feature preparation refers to the classification or accumulation of information, for example the classification of speech by speech recognition, the classification of touches by tactile recognition, the classification of emotional states by facial expression recognition, or the accumulation of information from smart devices to detect developments.

[0043] An input device may be, for example, a button or a touch screen. The scheduler is a timetable of measures that must be executed regularly, e.g. serving food or providing medication. The emergency controller is able to detect undesirable or negative events, e.g. signs of rejection of or resistance to the care robot, or a weak battery status. The emergency controller has access to the information pool.

[0044] The prioritization by the planner has the effect, for example, of continuing the current measure, i.e. it keeps the highest priority, suspending the current measure, i.e. assigning it a lower priority, aborting the current measure, i.e. deleting it from the list of measures, starting a new measure, or resuming a previously interrupted measure.

[0045] Fig. 2A shows a flow chart of the operation of the robot in the monitoring mode. The method comprises the following steps:

[0046] Step 1: detecting a signal using a sensor. In this step, a signal or signal pattern relating to the person in need of care or to the care environment is detected. The signals or signal patterns relate, for example, to a position signal, a voice pattern, a facial expression pattern, or a tactile pattern.

[0047] Step 2: analyzing the signal. In this step, the detected signal or signal pattern is interpreted, evaluated, or aggregated, for example by extracting features from time series.

[0048] Step 3: classifying the signal. In this step, the analyzed features are classified, for example by comparing the pattern with the personalized patterns in the user database to derive the emotional state of the person, or to recognize temporal developments in the signals from IoT devices.

[0049] Step 4: determining the needs of the person and the care environment with the aid of the situation network. In this step, the situation-related needs are calculated from the data of the information pool. The situation network is designed as an artificial neural network based on a probability model. The situation-related needs are the cumulative needs of the person in need of care and of the care environment measured over time. The calculation of the situation-related needs by the artificial neural network is therefore based not only on the actual needs, but also on the history of needs.

[0050] Step 5: determining the measures to satisfy the needs determined by the situation network. In this step, the appropriate measures for the situation-related needs are calculated. The action network is designed as an artificial neural network based on a probability model.

[0051] Step 6: determining the measures triggered by an input device. In this step, the measures triggered by an input device are defined. An input device is, for example, a button to order a certain care action, a scheduler for triggering actions that must be executed regularly, or an emergency controller.

[0052] Step 7: prioritizing the measures by the planner. In this step, the measures are prioritized according to an urgency level, e.g. from highest to lowest priority: (1) emergency measures, (2) measures ordered by the input device, (3) scheduled measures, (4) measures proposed by the situation manager.

[0053] Step 8: executing the measure with the highest priority. In this step, the most important measure is carried out.

[0054] Step 9: repeating steps (1) to (9) until a stop condition is reached. This step causes the robot to continue its activities until it is stopped by an external stop command.

[0055] Fig. 2B shows a flow chart of the workflow of the robot in the tactile interaction mode. The method comprises the following steps:

[0056] Step 1: detecting a tactile pattern by a sensor. In this step, a tactile pattern relating to the person in need of care is detected.

[0057] Step 2: analyzing the tactile pattern by an analysis unit. In this step, the detected tactile pattern is interpreted, evaluated, or aggregated, for example by extracting features from time series.

[0058] Step 3: classifying the tactile pattern using personalized tactile patterns. In this step, the analyzed features are classified, for example by comparing the pattern with the personalized patterns in the user database to derive the emotional state of the person, or to recognize temporal developments in the signals from IoT devices.

[0059] Step 4: determining the needs of the person using the situation network. In this step, the situation-related needs are calculated from the data of the information pool. The situation network is designed as an artificial neural network based on a probability model. The situation-related needs are the cumulative needs of the person in need of care and of the care environment measured over time. The calculation of the situation-related needs by the artificial neural network is therefore based not only on the actual needs, but also on the history of needs.

[0060] Step 5: determining the measures to satisfy the needs determined by the situation network. In this step, the appropriate measures for the situation-related needs are calculated. The action network is designed as an artificial neural network based on a probability model.

[0061] Step 6: determining the measures triggered by an input device. In this step, the measures triggered by an input device are defined. An input device is, for example, a button to order a certain care action, a scheduler for triggering actions that must be executed regularly, or an emergency controller.

[0062] Step 7: prioritizing the measures by the planner. In this step, the measures are prioritized according to an urgency level, e.g. from highest to lowest priority: (1) emergency measures, (2) measures ordered by the input device, (3) scheduled measures, (4) measures proposed by the situation manager.

[0063] Step 8: executing the measure with the highest priority. In this step, the most important measure is carried out.

[0064] Step 9: repeating steps (1) to (9) until a stop condition is reached. This step causes the robot to continue its activities until it is stopped by an external stop command.

[0065] Fig. 2C shows a flow chart of the operation of the robot in the social interaction mode. The method comprises the following steps:

[0066] Step 1: detecting an emotional state pattern by a sensor. In this step, an emotional state pattern relating to the person in need of care is detected.

[0067] Step 2: analyzing the emotional state pattern by an analysis unit. In this step, the detected emotional state pattern is interpreted, evaluated, or aggregated, for example by extracting features from time series.

[0068] Step 3: classifying the emotional state pattern with the aid of personalized emotional state patterns. In this step, the analyzed features are classified, for example by comparing the pattern with the personalized emotional state patterns in the user database to derive the emotional state of the person.

[0069] Step 4: determining the needs of the person using the situation network. In this step, the situation-related needs are calculated from the data of the information pool. The situation network is designed as an artificial neural network based on a probability model. The situation-related needs are the cumulative needs of the person in need of care and of the care environment measured over time. The calculation of the situation-related needs by the artificial neural network is therefore based not only on the actual needs, but also on the history of needs.

[0070] Step 5: determining the measures to satisfy the needs determined by the situation network. In this step, the appropriate measures for the situation-related needs are calculated. The action network is designed as an artificial neural network based on a probability model.

[0071] Step 6: determining the measures triggered by an input device. In this step, the measures triggered by an input device are defined. An input device is, for example, a button to order a certain care action, a scheduler for triggering actions that must be executed regularly, or an emergency controller.

[0072] Step 7: prioritizing the measures by the planner. In this step, the measures are prioritized according to an urgency level, e.g. from highest to lowest priority: (1) emergency measures, (2) measures ordered by the input device, (3) scheduled measures, (4) measures proposed by the situation manager.

[0073] Step 8: executing the measure with the highest priority. In this step, the most important measure is carried out.

[0074] Step 9: repeating steps (1) to (9) until a stop condition is reached. This step causes the robot to continue its activities until it is stopped by an external stop command.



The invention relates to a method for controlling the activities of a robot, whereby the robot comprises a situation manager, which is divided into a situation network for determining needs and an action network for determining the actions for satisfying the needs, a planner for prioritizing actions proposed by the situation manager and optionally by an input device, and a sensor for detecting an event. Both the situation network and the action network are based on probability models. Subdividing the situation manager into a situation network and an action network has the effect that the calculation of the proper action for a given situation is not based directly on the actual data, but on the calculation of the needs of the given situation.



1. Method for social interaction with a robot, wherein the robot comprises

- a situation manager, which is divided into a situation network for determining needs and an action network for determining the measures to satisfy the needs,

- a planner for prioritizing measures received from the situation manager and optionally from an input device,

- a sensor for observing the emotional state of a person,

comprising the following steps:

Step 1: detecting an emotional state pattern by the sensor

Step 2: analyzing the emotional state pattern by means of an analysis unit

Step 3: classifying the emotional state pattern with the aid of personalized emotional state patterns stored in a user database

Step 4: determining the needs by means of the situation network

Step 5: determining the measures to satisfy the needs determined in step 4 by means of the action network

Step 6: determining the measures triggered by the input device

Step 7: prioritizing the measures by the planner

Step 8: executing the measure with the highest priority

Step 9: repeating steps (1) to (9)

2. Method according to claim 1, wherein the input device is a user input device and/or a scheduler and/or an emergency controller.

3. Method according to claim 1 or 2, wherein the situation network and/or the action network is based on a probability model.

4. Method according to any one of claims 1 to 3, wherein the situation manager obtains information from an information pool, wherein the information pool draws on a sensor and/or on the Internet of Things and/or on a user database and/or on a history and/or on the communication channels of the open platform.

5. Method according to claim 4, wherein the information that the situation manager receives from the information pool is classified by a feature preparation task.

6. Robot for carrying out the method according to any one of claims 1 to 5, wherein the robot comprises a planner for prioritizing measures, which it receives from the situation manager and optionally from an input device, and a sensor for detecting a tactile pattern, characterized in that the situation manager is divided into a situation network for determining needs and an action network for determining the measures to satisfy the needs.

7. Robot according to claim 6, wherein the input device is a user input device and/or a scheduler and/or an emergency controller.

8. Robot according to claim 6 or 7, wherein the situation network and/or the action network is based on a probability model.

9. Robot according to any one of claims 6 to 8, wherein the situation manager obtains information from an information pool, wherein the information pool draws on a sensor and/or on the Internet of Things and/or on a user database and/or on a history and/or on the communication channels of the open platform.

10. Robot according to claim 9, wherein the information that the situation manager receives from the information pool is classified by a feature preparation task.

11. Robot according to any one of claims 6 to 10, wherein the sensor is a microphone and/or a camera.

12. Robot according to any one of claims 6 to 11, wherein the robot further comprises a speech generation unit and/or an image display unit.