TY - JOUR
T1 - Can Merging a Capability Approach with Effectual Processes Help Us Define a Permissible Action Range for AI Robotics Entrepreneurship?
AU - Kamishima, Yuko
AU - Gremmen, Bart
AU - Akizawa, Hikari
PY - 2018/2
N2 - In this paper, we first enumerate the problems that humans might face with a new type of technology such as robots with artificial intelligence (AI robots). Robotics entrepreneurs are calling for discussions about goals and values because AI robots, which are potentially more intelligent than humans, can no longer be fully understood and controlled by humans. AI robots could even develop into ethically “bad” agents and become very harmful. We consider these discussions as part of a process of developing responsible innovations in AI robotics in order to prevent catastrophic risks on a global scale. To deal with these issues, we propose the capability-effectual approach, drawing on two bodies of research: the capability approach from ethics, and the effectual process model from entrepreneurship research. The capability approach provides central human capabilities, guiding the effectual process through individual goals and aspirations in the collaborative design process of stakeholders. More precisely, by assuming and understanding correspondences between goals, purposes, desires, and aspirations in the languages of different disciplines, the capability-effectual approach clarifies both how a capability list working globally could affect the aspirations and end-goals of individuals, and how local aspirations and end-goals could either energise or limit effectual processes. Theoretically, the capability-effectual approach links the collaboration of stakeholders and the design process in responsible innovation research. Practically, this approach could potentially contribute to the robust development of AI robots by providing robotics entrepreneurs with a tool for establishing a permissible action range within which to develop AI robotics.
KW - AI
KW - Capability approach
KW - Effectuation
KW - Entrepreneurship
KW - Responsible innovation
KW - Robotics
DO - 10.1007/s40926-017-0059-9
M3 - Article
AN - SCOPUS:85058555987
SN - 1740-3812
VL - 17
SP - 97
EP - 113
JO - Philosophy of Management
JF - Philosophy of Management
IS - 1
ER -