When we hear the term “robot”, we often think first of the highly automated machinery used in industrial production. However, a new generation of robots, known as “social robots”, is conquering completely new areas – namely those that require intensive interaction with humans. Social robots are capable of learning and they process large amounts of data locally. This underlines the importance of robust data protection measures to minimize potential risks for both manufacturers and users.

Social robots are designed to have animated conversations with people, for example in stores, at airport information points or at trade fair stands. In some countries, such as Japan, social robots already serve as waiters in restaurants and as concierges or room service in hotels – other countries and other sectors are sure to follow.

Future market or dreams of the future?

Various retirement and nursing homes in Germany are currently testing social robots. This use case is very exciting, as care staff in such facilities often have too little time for leisurely interaction with residents. After all, care homes are struggling with tightly scheduled daily routines, increasing documentation requirements and a noticeable staff shortage.

When you see videos online showing how naturally social robots communicate with people and how impressed nursing home residents are with this initially unfamiliar exchange, you immediately realize that this is a future market with considerable potential.

Social robots, such as the much-discussed robot Pepper or its new, attractively designed colleague Navel from the Munich-based start-up Navel Robotics, could relieve the burden on stressed staff. This is because they are able to build relationships with their counterparts – even beyond the individual conversation.

AI meets the GDPR

Artificial intelligence provides the basis for the behavior of social robots. AI algorithms enable them to move autonomously in their environment, recognize people, interpret facial expressions and engage people in meaningful (or at least seemingly meaningful) conversations.

According to the manufacturer, Navel processes and analyzes the image data of facial expressions locally and deletes the image data immediately after analysis. For spoken interaction, on the other hand, it requires a cloud connection that must be GDPR-compliant.

This is a key point: social robotics cannot work without data security and data protection. In sensitive areas such as clinics, retirement homes or nursing homes (and by no means only there), personal data must always be protected in accordance with the state of the art and the relevant regulations.

What does this mean in concrete terms? If a robot uses cloud services such as ChatGPT for voice processing and conversational capabilities, several points must be regulated for use within the EU. The data must not be transferred to third countries that are unsafe under data protection law, which means the cloud provider’s data center should preferably be located within the EU legal area. In addition, the provider must contractually guarantee that it will not use the user organization’s data to train its AI models.
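
To make this tangible, the following is a minimal, purely illustrative sketch of how a user organization could enforce such a policy in software before any personal data leaves the robot. The provider fields, region identifiers and function names are assumptions for the sake of the example, not features of any real product.

```python
# Illustrative policy gate: personal data is only sent to a cloud backend if the
# backend is hosted in an allow-listed EU region and the provider has signed a
# "no model training" clause. All names and values here are hypothetical.
from dataclasses import dataclass

EU_REGIONS = {"eu-central-1", "eu-west-1", "europe-west3"}  # example region IDs

@dataclass
class CloudProvider:
    name: str
    region: str               # where the provider's data center is located
    no_training_clause: bool  # contractual guarantee: no use for model training

def may_send_personal_data(provider: CloudProvider) -> bool:
    """Return True only if the provider meets the minimum policy criteria."""
    return provider.region in EU_REGIONS and provider.no_training_clause

speech_backend = CloudProvider("speech-api", "eu-central-1", no_training_clause=True)
if not may_send_personal_data(speech_backend):
    raise RuntimeError("Cloud backend does not meet the data protection policy")
```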

This is a high hurdle and one that is difficult to verify. It is one of the reasons why social robots are likely to follow the same path that Navel Robotics has taken with its facial analysis: towards local data processing directly on the robot. This also avoids technical problems such as a lack of Wi-Fi coverage or excessive network latency.

However, local data processing also poses challenges. To build long-term social relationships, the robot needs to store a wide range of personal data, including the profiles of recognized faces, knowledge of the preferences of conversation partners and the history of its conversations with them.
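
What such locally held data might look like can be sketched as follows. The structure and field names are hypothetical; the point is that the robot can keep derived data such as face templates and conversation summaries instead of raw images and audio recordings.

```python
# Hypothetical sketch of the personal data a social robot might retain locally
# to sustain long-term relationships. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConversationTurn:
    timestamp: datetime
    summary: str                  # condensed content rather than a raw recording

@dataclass
class ResidentProfile:
    face_template: list[float]    # numeric embedding, not the original image
    display_name: str
    preferences: dict[str, str] = field(default_factory=dict)
    history: list[ConversationTurn] = field(default_factory=list)
```

Storing embeddings and summaries rather than raw images and recordings keeps both the storage footprint and the privacy risk smaller, but the data is still clearly personal and must be protected accordingly.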

Encrypted, tamper-proof data storage is important

This data must always be immediately and reliably retrievable and at the same time securely stored on the robot. Above all, this requires the use of encrypted storage media with end-to-end protection for the entire data path.
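
As a rough illustration, application-level encryption at rest can be sketched with the widely used Python `cryptography` package. In a real robot the key would be provisioned into hardware-backed, encrypted storage rather than generated ad hoc as in this simplified example.

```python
# Minimal sketch of encryption at rest using Fernet (AES-based authenticated
# encryption) from the `cryptography` package. In practice the key lives in a
# secure element or hardware-encrypted storage, never next to the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # illustrative only; real keys come from secure hardware
cipher = Fernet(key)

profile_bytes = b'{"display_name": "Ms. Example", "preferences": {"music": "jazz"}}'
token = cipher.encrypt(profile_bytes)   # ciphertext is also integrity-protected

# Decryption raises an exception if the ciphertext has been tampered with.
assert cipher.decrypt(token) == profile_bytes
```

Because Fernet combines encryption with an integrity check, manipulated ciphertexts are rejected on decryption, which corresponds to the tamper-proof property this section calls for.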

This means that the data of a social robot’s AI model – including the data it learns in the course of everyday operation – should be stored in an encrypted, tamper-proof data store. Access to this data store and its administration must be clearly regulated via a role and rights model. This is the only way to ensure that interactions, and therefore personal data, remain protected even if someone removes the storage medium from the robot.
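
A role and rights model for such a data store can be as simple as mapping each role to the operations it may perform. The roles and permissions below are purely illustrative.

```python
# Hypothetical role-and-rights model: only defined roles may read or administer
# the encrypted profile store. Role and permission names are illustrative.
ROLE_PERMISSIONS = {
    "care_staff": {"read_profile"},
    "administrator": {"read_profile", "export_audit_log", "rotate_keys"},
    "maintenance": {"read_diagnostics"},   # no access to personal data
}

def authorize(role: str, permission: str) -> None:
    """Raise PermissionError unless the role may perform the operation."""
    if permission not in ROLE_PERMISSIONS.get(role, set()):
        raise PermissionError(f"role '{role}' may not perform '{permission}'")

authorize("administrator", "rotate_keys")    # allowed
# authorize("maintenance", "read_profile")   # would raise PermissionError
```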

Even in the case of remote access – required for remote maintenance, for example – this data must remain protected against unauthorized use. And if the robot communicates with third-party systems via interfaces (APIs), these channels must of course also be protected against data exfiltration. These are all hurdles that can be overcome today. The technology is available.
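
How such an interface call can be protected is sketched below: TLS with certificate verification plus token-based authentication. The endpoint URL and the token handling are placeholders rather than references to a specific product.

```python
# Illustrative call to a third-party system over a protected channel. The URL is
# a placeholder; the token would come from the robot's credential store.
import requests

API_URL = "https://integration.example.org/v1/events"   # placeholder endpoint

def report_event(payload: dict, token: str) -> None:
    response = requests.post(
        API_URL,
        json=payload,
        headers={"Authorization": f"Bearer {token}"},
        timeout=5,
        verify=True,               # enforce TLS certificate validation
    )
    response.raise_for_status()    # surface transport or authentication failures
```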

Whether they are start-ups or established players, manufacturers of social robots who want to gain a foothold in the European market must guarantee a very high level of data security and data protection in accordance with EU law (GDPR, AI Act, etc.). Depending on the application, these requirements are also a high priority outside Europe. The following therefore applies: AI may be the brain of a social robot, but secure data storage is its heart. Both providers and users of social robots should bear this in mind when designing and selecting products.

Disclaimer: This article is sourced from the official Swissbit website. As official partners of Swissbit, we have obtained permission to use its articles and resources for further updates regarding Swissbit’s products.