ABSTRACT

One consequence of the rapid development of technology is that public policymakers struggle to develop, in time, policies that adequately address its impacts, challenges and opportunities. This favours private actors, who develop their own standards, thereby decentralizing the power to regulate. Industrial standards have been developed to govern industrial robot technology and one type of service robot – personal care robots. This chapter explains the complex intertwinement between public and private regulators in the case of robot technology.

We argue that while safety requirements have been set for these types of robots, legal principles and values deeply embedded in the social environments where they are deployed – privacy, dignity, data protection, cognitive safety, autonomy and ethics – have been disregarded, leaving users’ (fundamental) rights unprotected. Moreover, there are currently no safety requirements for other types of robots, including surgical, therapeutic, rehabilitation, educational and sexual robots; specific requirements for different types of users are missing; and a code of practice for robot developers, although under discussion, is currently unavailable.

Public policymaking can ensure a broader, multi-stakeholder scope of protection, but its abstractness often undermines its intelligibility and applicability. Private standards, on the other hand, may be much more concrete, but they are mostly produced through voluntary work without juridical guarantees and normally reflect industry interests. To provide comprehensive protection to robot users without losing technical concreteness, we advocate a better intertwinement between private standard-setting and public lawmaking.