Museums in an ethical minefield

How far can agency over museum experiences go before it becomes invasive and detrimental to visitors’ privacy? What can we sense; what shouldn’t be sensed? What happens if analysis of sensed data reveals (accidentally or purposely) upsetting information about a visitor? These questions contribute to an emerging demand to rethink our collective idea(l)s about ethical conduct and norms. This talk will address the specific interventions Science Gallery Melbourne has in mind and has previously experimented with, and how our insights contribute to an accessible and inclusive debate about ethics.

Speaker

Dr Niels Wouters
Head of Research and Emerging Practice
Science Gallery Melbourne

This presentation was filmed at the MuseumNext Digital Summit in Autumn 2019.

Niels Wouters: Great. Who gets excited when talking about ethics? Anyone? Okay, like 10% of you guys. That’s all right. Ethics is actually not very exciting, but I think what we are trying to do in Melbourne with Science Gallery and our programmes, we’re trying to make ethics more exciting. We’re also trying to make the discussion about ethics, which I think is a very urgent discussion that we all need to have as a society, a lot more accessible. And I guess today what I’m trying to share is a sort of manifesto for us as galleries and museums to be more unethical. That was a joke, I was hoping for some laughter. I’m not trying for us all to be more unethical, but I’m trying for us all to challenge ethical norms and standards, especially as new technologies, and we’ve seen very exciting opportunities today already, as these technologies permeate the public space.

Just by way of introduction, in case you haven’t heard about Science Gallery, we are a global network. We started out originally about 10 years ago in Dublin. We form a network of galleries that specifically intend to explore the collision of arts and science. We always team up with universities, with key universities in a series of global cities. And we try to tap into this really interesting young audience of 15 to 25-year-olds. We’re trying to expose them to the breadth and the interesting character of STEM disciplines.

As you can see on this map, we’ve got about seven nodes currently in development or already active. Melbourne is currently in development. We’re building this beautiful gallery space as part of an innovation district right in the city centre. It’s approximately 4,000 square metres of gallery space, including a social space, teaching and learning spaces. And then obviously our main gallery spaces. We are teamed up with the University of Melbourne, so we’re also trying to integrate as much of the education and research programmes in the gallery. This is just a shot of the inside of the space. And when talking about ethics, our floor is a galvanised steel floor, so we can technically do a mass electrocution of our visitors.

We’re in negotiations, whether we ever do that or not. We’ll probably have a MuseumNext talk about that one of the other times. But really there is this massive opportunity to talk about ethics and to interrogate ethics very differently. And I think we as museums and galleries, we are uniquely placed to do things like that. I think the discussion about ethics is more and more common. It’s becoming more and more accessible as well because we see some developments, not necessarily in the museum and gallery sector, but at a national level, for instance, I guess, what’s happening in China, which will affect us all at some point in the future and is really something that we need to discuss.

I’ll take you through three case studies of our past seasons in Melbourne, and I’ll pick them apart a little bit to see how we’ve tried to help researchers, help academics as well to interrogate the ethical aspects of their work. The first one I want to touch upon is a work called Biometric Mirror, which was part of our season last year, a season called Perfection. It’s a collaboration between Lucy McRae and myself. So I’m slightly biased here perhaps, but I come at it as an academic, and then Lucy McRae, for those who don’t know her, is a science fiction artist. She’s based in Melbourne and Los Angeles and is really looking at using and adopting and embracing technology as a way to look into the future and to experiment and look at how people will perceive and will experience these futuristic aspects.

What we did is develop an AI-driven installation that takes photos of people’s faces and analyses them. So it’ll tell you how old you look, your perceived gender, very binary, but also a range of psychometrics. So it tells you how attractive you’re perceived to be, how weird, how responsible, how emotionally unstable. And it does that all on large public screens in the middle of the city, in the main shopping street, right in front of a tram stop.

It’s been a really interesting work. There are a few versions of it currently travelling globally, which is really interesting. And the work is semi-controversial. We were all quite concerned about the potential backlash against it, but it was pretty well received. And there was a really good research framework around it as well as to what we really wanted to interrogate, how we wanted people to, I guess, learn about these new technologies. And I’ve got this little video of my colleague, Ryan Kelly, talking about the work and why it’s really important for us to do this kind of thing.

Speaker 2: Now it’s easy to laugh at the results or shrug them off as just a bit of fun, but there’s more to it than that.

Ryan Kelly: So one of the reasons it’s important to teach people about the limitations of artificial intelligence and these kinds of analyses is because people might assume that because it’s done by a computer it’s objective and correct. And what they might not realise is actually an application like Biometric Mirror draws on a dataset of faces that have been rated by people. And so those ratings contain human biases. And so one example in the dataset Biometric Mirror is that anybody with a beard is classified as aggressive. So of course I am classified as an aggressive person by Biometric Mirror, even though I don’t think I am. I hope I’m not.

Niels Wouters: He’s not, by the way. But really before we set out on making this work and launching it in this massive public space, we thought about all of the questions we wanted to unpack and we wanted to interrogate. And I think we kind of organically developed this lens to look at societal problems. First of all, obviously we would talk to visitors of the exhibit. How does this thing make you feel? And I guess you can imagine how it made most people feel: quite confronting, quite daunting, and I guess people felt very exposed by the fact that their photo and their psychometrics appeared in this big shopping street for 24 hours. But then we helped them along. And these are not necessarily the exact same questions that we asked them, but this is just to give you an idea of how we zoomed out. But we guided them along and we helped them to think about the bigger implications of such technologies.

And coming from Australia, what’s definitely very important for us is, like I said before, what’s happening in China with the social credit system. It’s very much a discussion that’s very active in Australia. I don’t think it’s that active in Europe at the moment, but we really wanted them to think about that very actively, to really consider it and see their own role in it, as this future might one day become reality. And also we try to make this phenomenon of ethics a lot more accessible. I’m not an ethicist, I’m an architect. And one of the challenges that I see with ethics is that it’s often studied by ethicists on a university campus, as you might imagine.

But very often they sit in this ivory tower and they don’t necessarily engage with society. And that’s one of the fundamental flaws, I think, in the current ethics debate, especially when it concerns emerging technologies that are launched and are, I guess, introduced upon us as a society at a very rapid pace. If you do sit in your ivory tower to study ethics, that’s probably not the best way to study these technologies. And really with a work such as Biometric Mirror it’s really about this last question. We did specifically design it to start a wide and public discussion about surveillance, about artificial intelligence, about facial recognition. So in that sense we kind of amplified the ethical impact and the personal impact of the work as much as possible.

A past season, so our last season, sorry, Disposable, looked at ethical issues of other phenomena. And I’ve taken a bit of a free approach to the theme digital here, because a lot of these works that I’ll be showing in this section, or the two works really, they’re not very digital in nature, but it’s still always around this ethical question. One of the key works of the season was a work called Urinotron. It’s a work by Gaspard and Sandra Bébié-Valérian, who are French and who developed this specific version of the work in collaboration with one of the university’s academics, Peter Scales, who is in chemical engineering and who has developed this really beautiful urine filtration system that turns it into drinkable water. I didn’t bring a bottle with me today. But as a conversation starter, and especially in Australia, so a country that’s facing unprecedented drought at the moment, it’s very much an active discussion.

In our very initial conversations with Peter he drew our attention to the fact that we wouldn’t be allowed to let people drink the filtered urine, even though it’s cleaner. Yeah, it’s clearer and cleaner than most water that you can buy in a supermarket. And that’s really a theme and an issue that we as Science Gallery really wanted to tap into. So we set up a series of panel discussions with legal professionals, with Peter as well, with artists, to see: do all of these rules, do all of these laws still make sense with the issues we face today?

The other aspect about the work was that you could charge your phone with the nutrients in your own urine. And we really handed out cups, thousands of cups, to all of the students and academics and staff on campus. Even the vice chancellor donated urine. We probably had to sign a waiver that we wouldn’t analyse his urine sample, which we didn’t, to be fair. But it was a really interactive work and everyone all of a sudden got this opportunity to think about the ethical implications of this. Again, I’ve got a short video of Peter talking about the work.

Peter Scales: So Urinotron’s a Science Gallery pop-up at the University of Melbourne. We’ve got a whole series of microbial fuel cells where we take urine from the public. And we use those fuel cells to generate some electricity and break down the carbon in the urine. And from there we take the spent urine and we put it through a series of columns and membranes and produce plain water. Would you drink it? Oh, great. So have a look, it’s nice and clean, isn’t it? No, there’s no floating bits. It’s very, very pure water.

Niels Wouters: You can see that Peter had a lot of fun. He’s a retired professor, so he can probably get away with saying all of these things. And again, looking back at some of the questions we wanted to interrogate, you see that we always started with this question. How does it make you feel as a visitor, as someone who has contributed one of your most personal liquids, I suppose? But also, how do your contributions change your behaviour? With Disposable specifically it was a season where we wanted to focus on the positive aspects that also come out of the current environmental crisis. There’s a lot of innovation happening. Obviously we’re all aware that there is global warming, even though some people might say that it’s not a thing in one particular country, presidents. Anyway, I’m getting distracted here, but we really wanted to showcase all of the innovation that was happening there.

And for us this was definitely one of the key potential innovations that could radically influence the future. So really what we wanted to interrogate also is: where is that public debate? All of these laws are in place. We can’t drink filtered urine, for what were probably good reasons back in the day. But these days all of the mechanisms, all of the innovations have changed and evolved so much that it’s probably safe to drink these things. And can these techniques promote environmental action? Can they empower us as citizens to think differently and to behave differently when dealing with environmental phenomena?

The last work I wanted to touch upon, and I think it’s currently being set up as I speak in Science Gallery Dublin, if I’m not mistaken, is a work called Patch. Developed and really advocated for by the fantastic [Yana Arendt 00:13:08]. It’s a wearable technology project along the lines of some we’ve seen today. It’s an environmental pollution tracker. So it tracks the air quality you’re exposed to. And it also allows you to, I guess, leave an impression of how environmentally friendly you’ve been that day, whether you’ve purchased any plastics or not.

We developed a visualisation together with Yana to show the outcomes of this work on a large public screen on campus. So this was actually the main setup of the season Disposable: right in front of the chancellery building in a key central location on campus, and we scrolled through each Patch, so the environmental sensor patches, 24 hours a day, with very different results. And even if visitors couldn’t really actively take part, they could still look at the results and could still look at the numbers and the visualisation to understand where in Melbourne at that time it was the most or least polluted and where people, I guess, were most or least environmentally conscious. And again, same approach, same questions. How does it make you feel? And is there an opportunity here for wearable technologies such as Patch or others to promote different forms of environmental action and ways for us to share all of this information on crowdsourced platforms?

And as I said before, I think today what I want to do is make a bit of an argument for why we as museums and galleries are probably very well-placed to interrogate ethics and to challenge ethical norms. And I think there are nine really good reasons why we should all be doing this more. We’re very safe. We provide very safe spaces and very comfortable places for people, for our visitors, to reside in. But participation for them is voluntary. If we were to set up a Patch or a Urinotron or a Biometric Mirror in any gallery or museum in the world, people would still be able to participate of their own will and at their own pace. If they don’t want to participate, that’s totally fine, the work will still be there and the questions will still be raised. And you might still be able to answer these questions by interacting with your audience.

But also we’re very fortunate in the sense that we provide manageable environments. With a work such as Biometric Mirror we had very intense discussions from day one about how to manage this. What if you’re a young person and Biometric Mirror tells you you’re not attractive and that’s being broadcast on this large screen? Again, you know where I’m going, in the massive shopping street, how will we manage that? And we all work with invigilators. We call them mediators, but young people that can staff an exhibit and that can give the necessary context. We also set up FAQs online. Obviously, because it is research, we went through an ethics application procedure. We made sure that all the necessary boxes were ticked, while still challenging all of these ethical frameworks and guidelines that are in place.

A really interesting one is that we all address and reach a very diverse demographic, even at Science Gallery: even though our target audience is the 15 to 25-year-olds, we obviously still address others outside of that range, outside of that category. And it’s really important to have all of these opinions collected and bring them all together as a way to start thinking about where ethics will take us, or where technology, sorry, will take us.

And then a museum is a trusted space, I think. In times of fake news we are probably one of the last few venues where you can be certain that everything is safe and is reliable and is trustworthy. And it’s definitely an opportunity for us too. If we have challenging technologies on display or as part of a visitor’s experience, obviously people will realise that this is well contained. It’s well taken care of. It’s well studied and properly implemented with all the safety mechanisms in place.

But then obviously, museums are spaces for art. We’ve always been spaces for critical reflection and reflection about where society is going to be taken next. Artists have always challenged ethical norms and they will and should keep doing so. So we provide this space where people can reflect on their own terms and share these reflections with us or not. And then along the same lines of this voluntary participation, there are various degrees of engagement. You could be the one donating your urine through Urinotron, so a very active engagement, and charge your phone as a result of that. But you could also be just this passive bystander who’s looking over people’s shoulders and trying to understand what’s happening. Maybe have a chat with one of the mediators or invigilators. Have a look at the website and the programme to understand what the work is all about, and still develop an opinion and an idea about where the ethics of such a technology and such an innovation lie.

And then obviously we should and we can look into the future, and we’ve always been spaces for speculation. We’ve always been spaces where, yeah, we’re really open for new discussions to emerge out of the standard discussions that might happen elsewhere. And I think we all thrive in this constant state of flux, which is where we already and always are with what we provide.

And really, for me I think it all comes down to this aspect of new cultural experience. And it takes me back to one of the quotes, or one of the things Lucy McRae said when we first got together, and sorry, I have to look at my notes, but she said, “We must use public spaces. We must use galleries and museums to talk about how technology changes our consumption of culture.” And I think that’s absolutely right, but I’d also like to flip it. And I think we should look at public spaces and use museums and galleries and the exhibits in these galleries to kind of look at how technology changes culture and society, which is really the opposite thinking pattern.

And for me it all comes down to this and you might recognise that illustration from all of the ethics questions that I mentioned before. But really we have this opportunity to look at technology and how people interact or don’t interact with technology and think about technology within the single individual exhibit and within the safe zone of our museum. But it does give us an opportunity to also think bigger picture and look at what the societal implications would be when a technology, say a Biometric Mirror, is let loose on society.

And look, my background is not in museums. I failed to mention that in the beginning. So I had to do a lot of research around where or how the ethics debate in museums is typically formed. And according to John Henry Merryman, there we go, there are two main lines of ethics in museums. There’s the governance ethics, that’s how you run a museum ethically. And there’s the acquisition ethics, which I don’t think I have to explain. But I think there’s a real opportunity to also have this third aspect, which is all around programme ethics. How can you make sure that your programme, so your season, your exhibit as a whole, your marketing campaign, your digital campaign, how can you make sure that that’s ethical in its own right? But how can you also look for opportunities to contribute to this very urgent ethics debate?

And obviously these are all intertwined. I don’t think any of these can work safely on its own, but it does give us an opportunity as galleries and museums to take up a role as ethics stewards, and kind of influence and drive the ethics discussion as well in a variety of ways. So we can tell society about where technology is going to take us next, or where it isn’t. We can inform ethics review procedures. In our case that’s very much the ethics review procedures with the universities. But at Science Gallery Melbourne we’re also looking at setting up our own ethics review panel to make sure that our works are ethically aligned and the whole research process, if there is one, is ethical in nature and there is no harm to our visitors. And obviously that ties into this whole idea of risk assessment.

For me, more and more, ethics review and risk assessment are very much the same thing. I mean, risk assessment has to do with physical things, in our case mostly: is it safe at night to have Biometric Mirror in Swanston Street in Melbourne? What if it rains? But there’s also an ethical risk of having unethical work on display. Obviously that feeds into aspects around data analysis. What can we do as a gallery with the data we collect from our research, from people interacting with Urinotron? Again, like I said, we don’t have any urine samples. But it does make us think about how to interact with all these things and deal with these things safely without harming people. And then ultimately something that we’re working on as a result of Biometric Mirror is very much policy development in terms of artificial intelligence, facial analysis, and recognition that’s currently being implemented in Australia.

That was pretty much what I wanted to say, but in case you’re keen to find out more about our next exhibition season, we currently have an open call running for our opening season. So that nice, beautiful gallery that you saw in the renders at the start, that’ll open sometime in early 2021. Our opening season will be all around mental health and wellbeing. But as you’ve seen from the three cases, we do like a challenge from time to time. So yeah, keep us in mind and feel free to chat today as well. Thank you.

Sarah: Thank you, Niels. Great presentation. Great examples. How did you facilitate the debate that the installations were designed to provoke?

Niels Wouters: How did we, sorry?

Sarah: Facilitate the debate, internally probably, that the installations were designed to provoke?

Niels Wouters: That’s a really good question. And we worked very hard with our own internal comms team within Science Gallery Melbourne, but also the university’s comms team. We sent out a press release before Biometric Mirror specifically went live because we did acknowledge that it was potentially very harmful. We had a FAQ online before the installation went live. We had interactions with the university’s psychology and counselling services in case a student or a citizen would file a complaint. So we really had everything in place to deal with issues if they would present themselves. Luckily, no issues at all. I think we got one email from a person who wanted data to be removed, but yeah, we had mechanisms in place to deal with that easily.

Sarah: Okay, so that sounds very good. How would you love to see the conversation about ethics within the cultural sector go beyond Melbourne? Do you try to influence the daily management as well?

Niels Wouters: That’s a very good question. Obviously, Melbourne is just this small faraway town on the other side of the world. But yes, I think that discussion needs to extend beyond Melbourne. I think platforms such as MuseumNext are really great platforms to start a public discussion about ethics. It’s probably also very much in the spirit of what we do as museums. It’s, I guess, inviting research in when you feel that some technology might be ethically provocative, and communicating outward. I think what Peter Young and Ava are doing, for instance, communicating these outcomes and findings later on publicly, I think that’s fantastic. I’m happy, or lucky, enough to speak Dutch, but doing it in English would be fantastic just so it reaches a massive audience, yeah.

Sarah: Was there a big difference in the feedback from visitors to the three exhibitions? And were you surprised by the feedback?

Niels Wouters: Yeah, we were obviously because of the nature of the works that were very different. Biometric Mirror had this massive tech focus. Urinotron had this massive environmental urine focus and then Patch was slightly more technical, but also very environmental. So obviously we attracted a very different audience and that’s also what we intended to do. With Perfection we did want to tap into this market of the Instagram young people, and that’s very much the audience we got. They were all taking pictures of themselves inside the Biometric Mirror booth, whereas Patch and Urinotron, I guess they were more engaged, a more engaged audience.

Sarah: Yeah. And it’s interesting. I was thinking about social media, and I mean that’s also a public space in a way. You were talking about those nine points you had and what makes a museum or a science gallery such a safe space to work in. And did you hear that Instagram is now going to remove this plastic surgery filter, the one that lets you change your face?

Niels Wouters: Uh-uh (negative).

Sarah: They changed it. Actually they have to remove it probably.

Niels Wouters: Okay, fantastic.

Sarah: So, I mean, did you hear about that and what do you think of that?

Niels Wouters: I didn’t. I think it’s great. Look, I’m not very good at Instagram at all. I went to an Instagram museum yesterday in Amsterdam.

Sarah: Oh, you did?

Niels Wouters: There’s an experience centre. Imagine me surrounded by teenage girls going through that experience. I’m still processing it.

Sarah: Are you okay?

Niels Wouters: Look, I think Instagram is very important. It’s a very important platform for young people to engage with the outside world, but it does pose a lot of challenges and it’s definitely something we wanted to highlight as well.

Sarah: Yes, all right. Well thank you very much.
