Regulatory challenges of robotics: some guidelines for addressing legal and ethical issues


Ronald Leenes, Erica Palmerini, Bert-Jaap Koops, Andrea Bertolini, Pericle Salvini & Federica Lucivero, Regulatory challenges of robotics: some guidelines for addressing legal and ethical issues, Law, Innovation and Technology, pp. 1-44. Received 01 Mar 2017, Accepted 07 Mar 2017, Published online: 23 Mar 2017.

Robots are slowly, but certainly, entering people’s professional and private lives. They require the attention of regulators due to the challenges they present to existing legal frameworks and the new legal and ethical questions they raise. This paper discusses four major regulatory dilemmas in the field of robotics: how to keep up with technological advances; how to strike a balance between stimulating innovation and protecting fundamental rights and values; whether to affirm prevalent social norms or nudge social norms in a different direction; and how to balance effectiveness versus legitimacy in techno-regulation. The four dilemmas are each treated in the context of a particular modality of regulation: law, market, social norms, and technology as a regulatory tool. For each, we focus on particular topics – such as liability, privacy, and autonomy – that often feature as the major issues requiring regulatory attention. The paper then highlights the role and potential of the European framework of rights and values, responsible research and innovation, smart regulation and soft law as means of dealing with the dilemmas.

KEYWORDS: Robotics, regulation, regulatory dilemmas, technology regulation, smart regulation, responsible innovation, soft law

Enhancing accountability in the cloud


Martin Gilje Jaatun, Siani Pearson, Frédéric Gittler, Ronald Leenes & Maartje Niezen, Enhancing accountability in the cloud, International Journal of Information Management (2016).

This article focuses on the role of accountability within information management, particularly in cloud computing contexts. Key to this notion is that an accountable Cloud Provider must demonstrate both willingness and capacity for being a responsible steward of other people’s data. More generally, the notion of accountability is defined as it applies to the cloud, and a conceptual model is presented related to the provision of accountability of cloud services. This allows a consideration of accountability at different levels of abstraction, including the operationalisation of accountability. It is underpinned by fundamental requirements for strong accountability, which in particular are aimed at avoiding risks in the provision and verification of accounts (which include different types of accountability evidence and notifications that may need to be provided to other cloud actors, including data subjects, cloud customers and regulators). In addition, the article sketches what kinds of tools, mechanisms and guidelines support this in practice, and discusses these in the light of the upcoming European Data Protection Regulation.

Under Observation: The Interplay Between eHealth and Surveillance


Samantha Adams, Nadezhda Purtova & Ronald Leenes, Under Observation: The Interplay Between eHealth and Surveillance, Dordrecht, etc.: Springer, 2017, DOI: 10.1007/978-3-319-48342-9.

The essays in this book clarify the technical, legal, ethical, and social aspects of the interaction between eHealth technologies and surveillance practices. The book starts out by presenting a theoretical framework on eHealth and surveillance, followed by an introduction to the various ideas on eHealth and surveillance explored in the subsequent chapters. Issues addressed in the chapters include privacy and data protection, social acceptance of eHealth, cost-effective and innovative healthcare, as well as the privacy aspects of employee wellness programs using eHealth, the use of mobile health app data by insurance companies, advertising industry and law enforcement, and the ethics of Big Data use in healthcare. A closing chapter draws on the previous content to explore the notion that people are ‘under observation’, bringing together two hitherto unrelated streams of scholarship interested in observation: eHealth and surveillance studies. In short, the book represents a first essential step towards cross-fertilization and offers new insights into the legal, ethical and social significance of being ‘under observation’.

The Cookiewars – From regulatory failure to user empowerment?


The European regulator recognised relatively early on the potential privacy harms of cookies as a means to facilitate the tracking and tracing of individuals as they browse the internet. The ePrivacy Directive regulates the use of cookies (amongst other mechanisms) in this respect, requiring the affected individual’s informed consent. So far, the regulation has not been very successful in limiting the tracking and tracing of individuals (primarily for the purpose of personalised, or behavioural, advertising). It has been strongly opposed by the relevant industries and has seen a very low level of compliance; where compliance exists, it has been very slow in the making. Furthermore, ironically, the regulation’s intended beneficiaries, individuals, have also opposed it.

The battle to stop the unconsented tracking & tracing of individuals seems particularly lost now that the implementation of the cookie law’s requirement by and large seems to have moved from requiring the individual’s consent for the placement and use of cookies (thus providing the individual with a choice not to be tracked) to a mere acknowledgement that cookies will be used (and hence individuals will be traced, no matter what they want). The industry has succeeded in completely subverting and undermining the regulation’s aim. The ‘cookie law’ can thus be seen as an example of regulatory failure in the domain of privacy and data protection.

However, the cavalry might be around the corner. Although ad-blockers, which by and large also block tracking cookies from being installed on the user’s equipment, have been around for some years, their use was until recently confined to techies and nerds. In the last couple of years this has been changing. Ironically, the popularity of Google Chrome goes hand in hand with the rise of ad-blockers on desktops (and laptops). Until recently, ad-blockers did not exist on one of the most important platforms for advertising revenues, iOS. This has changed with the launch of iOS 9 in mid-September 2015. Suddenly ad-blockers are clearly on everyone’s agenda, either as threat or blessing. The adoption rate of both iOS 9 and Safari ad-blockers is stunning and might represent a significant factor in changing the ad and tracing game altogether.

This contribution explores the ongoing cookie-wars by discussing the move from regulation to the market and code as modalities for the regulation of human behaviour.

Ronald Leenes (2015), The Cookiewars – From regulatory failure to user empowerment?, in: Marc van Lieshout & Jaap-Henk Hoepman (eds), The Privacy & Identity Lab; 4 years later, Nijmegen: The Privacy & Identity Lab, pp. 31-49, ISBN: 978-90-824835-0-5.

The Governance of Cybersecurity


Adams, S., Brokx, M., Dalla Corte, L., Savic, M., Kala, K., Koops, B.J., Leenes, R., Schellekens, M., E Silva, K. & Skorvánek, I. (Nov 2015), The Governance of Cybersecurity: A comparative quick scan of approaches in Canada, Estonia, Germany, the Netherlands and the UK, Tilburg: Tilburg University, 166 p.

Taming the Cookie Monster with Dutch Law – A Tale of Regulatory Failure


Ronald Leenes & Eleni Kosta, Taming the Cookie Monster with Dutch Law – A Tale of Regulatory Failure, Computer Law & Security Review 31 (2015), pp. 317-335.

Profiling the online behaviour of Internet users has become a defining feature of the Internet. Individual surfing behaviour is tracked by many enterprises for statistical purposes, but also for behavioural advertising and other personalisation services. Profiling implies the processing of personal data, often facilitated by cookies and other markers placed on the terminal equipment of Internet users. The European rules for the regulation of cookies and similar technologies were modified in 2009 to require prior consent of the user, in order to guarantee that the user has some control over the processing of their information. In 2013 the Netherlands introduced probably the strictest implementation of the European rules concerning the installation of cookies. However, in practice the new legal requirements resulted in neglect of the obligations regarding user information on the one hand, and in the widespread deployment of annoying banners, popup screens and ‘cookie walls’ on the other. Not only the advertising industry, but also web publishers and even ordinary Internet users opposed the regulation. Furthermore, the regulation, certainly initially, did not lead to increased user control. These and other factors support the conclusion that the Dutch cookie regulation is a case of regulatory failure. This paper discusses the practices that were deployed in the Netherlands and assesses them based on a multi-site study that examined the practices of 100 Dutch websites with regard to the installation of cookies. It further reflects on the response of the Dutch regulator, who – under pressure from industry and consumer outcry – amended the relevant provisions of the Dutch Telecommunications Act in 2014.

Laws on Robots, Laws by Robots, Laws in Robots: Regulating Robot Behaviour by Design


Ronald Leenes & Federica Lucivero, Laws on Robots, Laws by Robots, Laws in Robots: Regulating Robot Behaviour by Design, Law, Innovation and Technology, (2014) 6(2) LIT 194–222.

Speculation about robot morality is almost as old as the concept of the robot itself. Asimov’s three laws of robotics provide an early and widely discussed example of the moral rules that robots should observe. Despite the widespread influence of the three laws of robotics and their role in shaping visions of future robo-dense worlds, these laws have been neglected by hands-on roboticists, who have been busy addressing less abstract questions about robots’ behaviour concerning space locomotion, obstacle avoidance and automatic learning, amongst other things. However, robots should not only be able to perform these locomotive and haptic acts to function successfully in society; when robots enter our everyday lives they will also have to observe social and legal norms. For example, social robots in hospitals are expected to observe social rules, and robotic dust cleaners scouring the streets for waste as well as automated cars will have to observe traffic regulations. In this article we elaborate on the various ways in which robotic behaviour is regulated. We distinguish between imposing regulations on robots, imposing regulation by robots, and imposing regulation in robots. In doing this, we distinguish between regulation that aims at influencing human behaviour and regulation whose scope is robots’ behaviour. We claim that the artificial agency of robots requires designers and regulators to look at the issue of how to regulate robot behaviour in a way that renders it compliant with legal norms. Regulation by design offers a means for this. We further explore this idea through the example of automated cars.


Towards Strong Accountability for Cloud Service Providers


Martin Gilje Jaatun, Siani Pearson, Frédéric Gittler & Ronald Leenes, Towards Strong Accountability for Cloud Service Providers, to appear in Proceedings of IEEE CloudCom 2014.

In order to be an accountable organisation, Cloud Providers need to commit to being responsible stewards of other people’s information. This implies demonstrating both willingness and capacity for such stewardship. This paper outlines the fundamental requirements that must be met by accountable organisations, and sketches what kind of tools, mechanisms and guidelines support this in practice.

Timing the Right to Be Forgotten


Paulan Korenhof, Jef Ausloos, Ivan Szekely, Meg Ambrose, Giovanni Sartor & Ronald Leenes (2015), Timing the Right to Be Forgotten: A Study into “Time” as a Factor in Deciding About Retention or Erasure of Data, in: S. Gutwirth et al. (eds.), Reforming European Data Protection Law, Law, Governance and Technology Series 20, Dordrecht: Springer, pp. 171-201, DOI: 10.1007/978-94-017-9385-8_7.

The so-called “Right to Be Forgotten or Erasure” (RTBF), article 17 of the proposed General Data Protection Regulation, provides individuals with a means to oppose the often persistent digital memory of the Web. Because digital information technologies affect the accessibility of information over time, and time plays a fundamental role in biological forgetting, ‘time’ is a factor that should play a pivotal role in the RTBF. This chapter explores the roles that ‘time’ plays and could play in decisions regarding the retention or erasure of data. Two roles are identified: (1) ‘time’ as the marker of a discrete moment where the grounds for retention no longer hold and ‘forgetting’ of the data should follow, and (2) ‘time’ as a factor in the balance of interests, adding or removing weight to the request to ‘forget’ personal information or to its opposing interest. The chapter elaborates on these two roles from different perspectives and highlights the importance, and underdeveloped understanding, of the second role.


Open-source intelligence and privacy by design


Koops, E.J., Leenes, R.E., & Hoepman, J.H. (2013). Open-source intelligence and privacy by design. Computer Law and Security Review, 29(6), 676-688.


As demonstrated by other papers on this issue, open-source intelligence (OSINT) by state authorities poses challenges for privacy protection and intellectual-property enforcement. A possible strategy to address these challenges is to adapt the design of OSINT tools to embed normative requirements, in particular legal requirements. The experience of the VIRTUOSO platform will be used to illustrate this strategy. Ideally, the technical development process of OSINT tools is combined with legal and ethical safeguards in such a way that the resulting products have a legally compliant design, are acceptable within society (social embedding), and at the same time meet in a sufficiently flexible way the varying requirements of different end-user groups. This paper uses the analytic framework of privacy design strategies (minimise, separate, aggregate, hide, inform, control, enforce, and demonstrate), arguing that two approaches for embedding legal compliance seem promising to explore in particular. One approach is the concept of revocable privacy with spread responsibility. The other approach uses a policy mark-up language to define Enterprise Privacy Policies, which determine appropriate data handling.

Both approaches are tested against three requirements that seem particularly suitable for a ‘compliance by design’ approach in OSINT: purpose specification; collection and use limitation and data minimisation; and data quality (up-to-dateness). For each requirement, the paper analyses whether and to what extent the approach could work to build the requirement into the system. The paper concludes that legal requirements cannot be embedded fully in OSINT systems. However, it is possible to embed functionalities that facilitate compliance by allowing end-users to determine to what extent they adopt a ‘privacy-by-design’ approach when procuring an OSINT platform, extending it with plug-ins, and fine-tuning it to their needs. The paper argues that developers of OSINT platforms and networks have a responsibility to make sure that end-users are enabled to use privacy by design, by allowing functionalities such as revocable privacy and a policy-enforcement language.