Jan 07, 2014
 

Ethics in ICT matter. And the competences needed to develop solutions to the infoethics challenges of the early 21st century are likely to be in increasing demand.

First of its kind, a one-day workshop on ethics and social accountability for ICT was held at The British Computer Society (BCS) London Office on October 22, 2013. This was a joint initiative between BCS ICT Ethics Specialist Group and IFIP’s Special Interest Group 9.2.2 on Ethics and Computing and IFIP’s Working Group 9.2 on Social Accountability and ICT. It was led by the two chairs, Penny Duquenoy and Diane Whitehouse.

A highlight of the day was an ethics training session led by Professor Don Gotterbarn of the Association for Computing Machinery (ACM): “Let’s look at some difficult ethical situations relating to ICT, and apply some techniques to facilitate identifying ethically and technically appropriate responses,” said Professor Gotterbarn. This interactive training proved an interesting way to draw out constructive and sometimes surprising solutions from ICT people of all backgrounds.

Some 30 attendees debated the issues that emerged. A dozen speakers from Finland, the Netherlands, the UK and the US tackled critical topics, focusing on technical and practical contexts. Chief among the challenges, at least for the time being, were identity, privacy and security.

Richard Taylor of the International Baccalaureate addresses the group

Yet, in social and societal terms, the day ended with a focus on ethics and ICT in education, research, and professional and commercial organisations. Brainstorming garnered many ideas for ways and means of engaging with and influencing stakeholders. The three groups are planning to take these ideas forward together.

Attendees relaxing at the end of the day


Workshop Programme


A one-day workshop on ethics and social accountability for ICT, held at The British Computer Society London Office. A joint initiative between the BCS ICT Ethics Specialist Group, the International Federation for Information Processing (IFIP) Working Group 9.2 on Social Accountability and ICT, and IFIP Special Interest Group 9.2.2 on Ethics and Computing.

Tuesday 22nd October 2013, from 9.30 am to 4.30 pm.

Abstracts

Societal and ethical implications of ICT – IFIP: Diane Whitehouse and Penny Duquenoy, IFIP WG9.2 and SIG 9.2.2 with Denise Oram, SIG 9.2.2

A brief introduction to the work of IFIP TC9, Working Group 9.2 and SIG 9.2.2 on the societal and ethical implications of ICT. [201310_Whitehouse&Duquenoy&Oram_Societal and Ethical Implications PDF, 3.7Mb]

Mediating between Technology and its Social Consequences: Denise ORAM, Glyndŵr University, United Kingdom

How do we mediate between moving forward with technology and the societal consequences it may create, some of which are currently unpredictable? There appears to be a gap between the speed of technological advance and our understanding of its implications. Are we considering the consequences?

Big Data: Size matters: Stephen RAINEY, St. Mary’s University College, UK

Big Data poses a problem for the users of online communications in the sense that these users’ every input leaves traces. These traces are ‘big data’ and are typically used in ways not intended by the user who made them. The problem pursued here is that big data cannot actually be said to refer to anything in particular until these secondary uses occur; in other words, big data is essentially transformational until operated upon. Ethical issues arise because the use of data without consent and out of context can lead to misinterpretations. The translation of data into information is therefore an area of great import and must be examined, and the further move from information to knowledge is another problematic step in need of scrutiny. Big data stands for nothing until interpreted. These interpretations must be subject to examination in order to 1) understand what information big data can subserve and 2) determine what information gained via interpretation of big data can and ought to be used for or relied upon, especially in terms of knowledge claims. [201310_Rainey_Big Data PDF, 962kb]

A permanent war-like Information Society? Paul DE LAAT, Groningen University, Netherlands

ICTs are impoverishing us as persons and undermining our moral skills; ICTs for surveillance and hacking/cracking plunge us into a permanent war-like Information Society. How should ICT professionals respond to these challenges? [201310_deLaat_Philosophy of STS PDF, 83kb]

Racist cameras: a case for understanding the requirement for IT-Ethics: Kai KIMPPA and Olli HEIMO, Turku University, Finland

Facial recognition applications are typically designed in organisations consisting mainly of white (male) engineers, and so the original accuracy testing is carried out on that same segment of the population. Unfortunately, the testing does not always extend to other segments of the population even later, and thus both false positive and false negative rates in the overall population exceed those found in the laboratory, causing the systems to both target and miss people whose origins differ from the designers’. Engineers therefore need education to become aware of the effects the applications they design can have on different stakeholders.
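Part of the point here is methodological: a system’s aggregate accuracy can look acceptable while its errors cluster in particular groups, and only per-group measurement reveals this. The sketch below illustrates such disaggregated evaluation; the data, group labels and function names are illustrative assumptions rather than material from the talk.

```python
# A minimal sketch of disaggregated error analysis for a recognition system.
# Each record holds (group label, system decision, ground truth); all of the
# names and example data are invented for illustration.
from collections import defaultdict

def error_rates_by_group(records):
    """Return per-group false positive and false negative rates."""
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
    for group, predicted_match, true_match in records:
        c = counts[group]
        if true_match:
            c["pos"] += 1
            if not predicted_match:
                c["fn"] += 1
        else:
            c["neg"] += 1
            if predicted_match:
                c["fp"] += 1
    return {
        g: {
            "false_positive_rate": c["fp"] / c["neg"] if c["neg"] else None,
            "false_negative_rate": c["fn"] / c["pos"] if c["pos"] else None,
        }
        for g, c in counts.items()
    }

# Aggregate accuracy can look fine while one group carries most of the errors.
records = [
    ("group_a", True, True), ("group_a", False, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", False, True), ("group_b", True, True),
]
print(error_rates_by_group(records))
```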

Interactive ethics session: Some Techniques for Computing Professionals to Address Ethical Surprises: Don GOTTERBARN, Chair, ACM Committee on Professional Ethics, USA

Frequently, well-intentioned professionals are surprised by the negative ethical impacts of their work. When some of these potential negative impacts are anticipated before development, the technical professional is faced with the difficult task of sorting through the critical elements of a situation and then identifying the best (both technical and ethical) course of action. In this interactive session we will look at some difficult ethical situations relating to ICT, and apply some techniques to facilitate identifying ethically and technically appropriate responses.

IFIP’s work on ethics and social responsibility – an update from UNESCO: Diane WHITEHOUSE and Penny DUQUENOY, IFIP WG9.2 and SIG9.2.2

A brief background on IFIP’s own activities in the two domains of SIG 9.2.2 (the framework of ethics of computing) and WG9.2 (social accountability and computing), and a brief insight into the latest developments from UNESCO on ethics relating to IT and society. This presentation will not cover current European Commission-related plans on responsible innovation, since those will be taken up by Aygen Kurt-Dickson at the end of the day. [201310_Whitehouse&Duquenoy_UNESCO PDF, 4.5Mb]

Pseudonymous data: Peter SINGLETON, Cambridge Health Informatics, UK

A pseudonym is used to hide a person’s true identity; ‘pseudonymous data’ is data where the identity of the subject is hidden, though not necessarily unrecoverable, and so is properly ‘personal data’. This paper discusses the issues of use and possible re-identification of pseudonymous data in the context of the current draft EU Data Protection Regulation, and recommends that some relaxation of the full requirements for identifiable ‘personal data’ be considered for ‘pseudonymous data’. [201310_Singleton_Pseudonymous Data PDF, 123kb]
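To make the definitional point concrete, the sketch below shows one common construction of pseudonymous data, a keyed hash over a direct identifier. It is offered as an illustrative assumption, not as the approach analysed in the paper; the key, identifier and record layout are invented. Whoever holds the key can re-derive the pseudonym and re-link records, which is why such data remains ‘personal data’.

```python
# A minimal sketch of pseudonymisation via a keyed hash (HMAC-SHA256).
# The key, the identifier and the record below are invented for illustration.
import hmac
import hashlib

SECRET_KEY = b"held-by-the-data-controller"   # illustrative; must be protected

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "943 476 5919", "diagnosis": "asthma"}
shared = {"patient_pseudonym": pseudonymise(record["patient_id"]),
          "diagnosis": record["diagnosis"]}
print(shared)

# Re-identification is trivial for the key holder: hashing a candidate
# identifier with the same key reproduces the pseudonym.
assert pseudonymise("943 476 5919") == shared["patient_pseudonym"]
```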

Truth about deception – detection in court testimonies: Anna VARTAPETIENCE and Dr Lee GILLAM: University of Surrey, United Kingdom

It is now possible, to an extent, to infer the age and gender of an author using automated or semi-automated systems (a task referred to as author profiling). Approaches have been developed for identifying whether the author of a document is who s/he claims to be, and even for revealing the true identity of a secret author (author identification or verification). Promising work has also been done on detecting cyber-predators in chat logs. However, are such systems suitable to help us ascertain whether someone is “telling the truth, the whole truth, and nothing but the truth”? In this presentation, we will discuss these various approaches, and specific recent research in this direction, to explore the possibility of identifying deception in court testimonies. [201310_Vartapetience&Gilliam_Truth and Deception PDF, 1.3Mb]
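For readers unfamiliar with this line of work, the sketch below frames deception detection in the same way as author profiling: supervised text classification over labelled examples. The statements, labels and model choice are invented for illustration and are not drawn from the talk or from any real court data.

```python
# A minimal sketch of deception detection as supervised text classification.
# Training statements and labels are invented; a real system would need a
# large, carefully collected corpus and far richer features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

statements = [
    "I was at home all evening and spoke to nobody",
    "I left the office at six and drove straight to the gym",
    "To be perfectly honest, I would never even think of doing that",
    "I honestly swear I have absolutely no idea what you mean",
]
labels = ["truthful", "truthful", "deceptive", "deceptive"]  # illustrative only

classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),      # word and bigram cues
    LogisticRegression(max_iter=1000),
)
classifier.fit(statements, labels)

print(classifier.predict(["I swear I honestly was nowhere near the building"]))
```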

Privacy seals: tools of social accountability? Rowena RODRIGUES, Trilateral Research & Consulting LLP, United Kingdom

Privacy seals enable organisations to demonstrate respect for privacy and develop a trustworthy image. Their importance has been recognised at the international, European and national levels. Specifically, Recital 77 of the proposed General Data Protection Regulation (which would supersede the existing Data Protection Directive 95/46/EC) suggests that certification mechanisms, data protection seals and marks should be encouraged to allow data subjects to quickly assess the level of data protection of relevant products and services, and to enhance transparency and compliance with the Regulation. Based on research conducted for the EU Study on Privacy Seals (commissioned by the European Commission, Joint Research Centre, Ispra), we will discuss how privacy seals contribute, or fail to contribute, to social accountability. [201310_Rodrigues_Privacy Seals PDF, 73kb]

Computers in schools: predicting the future? Richard TAYLOR: International Baccalaureate, United Kingdom

Over the last 25 years, advances in Information and Communication Technologies (ICTs) have led to significant changes in the way that computers are used within an educational context. As schools in many ways reflect a microcosm of society, one consequence for the school community is that the relationship between computers and computer users has become increasingly complex, with a blurring of the traditionally clearly defined boundaries. This blurring may be attributed to many factors, but the continuing miniaturisation, portability and ubiquity of the computer itself is probably the most important. For many students the computer, as well as being a de rigueur fashion accessory, is seen as indispensable; the thought of having to function without one is hard to envisage. The only certainty in this continuous evolution of computers is that the digital landscape will be in almost constant change. [201310_Taylor_Computers in Schools PDF, 144kb]

Ethical and social aspects of computing: opportunities for research: Aygen KURT-DICKSON, London School of Economics and Political Science, UK

The ethical and social aspects of computing are looming ever larger on the horizon, both in terms of the wonderful opportunities offered by technology use and the challenges it poses with regard to design and use. Not just companies, but research agencies and research councils too share these concerns, and are looking to fund research into these questions. It is likely that this will form part of what is increasingly, in Europe, termed “responsible research and innovation”. Based on a selected number of interviews, this talk overviews the evidence base for a practical orientation to these issues. It focuses on how to work with a variety of stakeholders. It poses the question “where to from here?” It also scans the horizon to examine some international, European and UK opportunities for social science and technological research into IT ethics and appropriate IT design, development and deployment.