Autonomous Cyber Physical Systems
Autonomous Cyber Physical Systems (ACPS) are at the center of the digital transformation. The term refers to the tight conjoining of, and coordination between, computational and physical resources. ACPS are controlled or monitored by computer-based algorithms while being embedded into our daily lives. Examples of ACPS include autonomous cars, smart grids, e-health assistance systems and Industry 4.0 applications.
While the excitement about ACPS is high (e.g. because they promise to relieve humans from tedious or error-prone tasks and are advancing at enormous speed), there are growing normative, societal and legal concerns about the acceptance of, cooperation with, and governance of such systems. Despite the increasing “smartness” of ACPS, do we know which decisions are taken, when, and for what reasons? Can we assess at all whether the rationale underlying such decision-making is compliant with our societal values and consistent with our individual value systems? What are the effects of an ACPS on the cooperation and collaboration of a human partner in work and day-to-day life environments? What are the ethical and societal norms under which ACPS should be designed and implemented? These questions are at the center of SEAS. We seek to address them using an integrative research approach that combines insights from the social sciences, philosophy, psychology and law with technological expertise from computer science. This integrative approach is based on two guiding theoretical constructs: the social embeddedness of technological systems and self-explanation as a distinct feature of future ACPS design.
SEAS analyzes the social embeddedness of ACPS and the potential effects of self-explaining ACPS from an interdisciplinary perspective, as illustrated in Figure 1. We distinguish three analytical dimensions, which we label Acceptance, Cooperation and Governance of ACPS. Acceptance focuses on individual attitudes, norms and capabilities towards ACPS. It concentrates on the development of pertinent methods and tools for constructing relevant justifications for diverse types of decision situations. Cooperation focuses on human-machine interaction. The success of cooperation, and the trust humans place in ACPS, can be significantly enhanced if the automation provides information about its purpose, process and performance in a human-understandable way. Governance focuses on the societal level, asking which agents (voters, policy makers, stakeholders) want which type of governance of ACPS, and for what reasons. The social embeddedness of ACPS becomes most tangible in the domain of safety-critical systems: systems whose failure or malfunction may result in death or serious injury to people, loss of or severe damage to infrastructure, or severe environmental harm. SEAS concentrates on the domains of mobility, energy and health. All three domains are characterized by mission-, safety- and time-critical distributed decision-making across the human-ACPS boundary, under partial observability and imperfect information, and with widely varying skill levels of individual participants.