
AI Ethics: Trust in humans and machines

Seminar

Welcome to a seminar in the AI ETHICS series at Chalmers, with speaker Jonas Ivarsson, Division of Informatics, Department of Applied Information Technology, University of Gothenburg.

The discussion departs from a sociological understanding of how trust is organised in interaction between people. From this perspective, trust is essential to human sociality; many of our social structures rest in part on this fundamental principle. Against this backdrop, we investigate the development of artificial intelligence. When machines are given agentive functions, they can begin to emulate the forms of concerted action familiar from ordinary life. Examples of such human-machine interactions will be given through empirical illustrations. These cases raise the philosophical question of what it means to "act together". They also highlight ethical issues concerning how we ought to design technology that operates alongside, or even together with, humans, so as not to erode trust and, by extension, the foundation of our society.

About AI ETHICS at Chalmers
A series of seminars highlighting ethical perspectives on artificial intelligence. The series features invited speakers and Chalmers researchers, with the aim of cultivating an informed discussion on ethical issues. The seminars are organised by the AI Ethics Committee within Chalmers AI Research Centre (CHAIR).

Speaker: Jonas Ivarsson

Date: 2020-03-17

Time: 13:15-14:15

Categories: IT

Location: EB lecture hall, Hörsalsvägen 11, EDIT stairways C, D and H

Event link: Read more about AI Ethics: Trust in humans and machines

Contact person: Simon Ungman Hain

Page updated: 2017-05-02
