Panopticon captures and examines video data

What did the team learn from recently capturing and examining video data of people in the lab playing Hotline Miami and Tekken 2?

We noted that players exhibited a number of different behaviours depending on what was happening in the game, including leaning forward when concentrating and leaning back when tension was released. They tended to smile frequently – particularly when key game events occurred.

When playing the two-player fighting game Tekken 2, they displayed more facial expressions, and, interestingly, single players also did so when they realised that someone was standing behind them!

These findings have led the team to consider what can be learnt about the experience of a game from the sequence of facial expressions a player pulls.

The Panopticon project has deployed a data capture set-up at the National Videogame Arcade (NVA), aiming to capture sufficient video data of people playing so that eMax can be trained to better recognise player behaviours and expressions automatically.

Photo by Element5 Digital from Pexels

If you’re visiting the NVA this week and want to participate in the Panopticon research project, or want to learn more, look out for us in the foyer!

 

Privacy, Law & Ethics Cross Cutting Theme update

In order to reflect on impacts on wider human values and to embed safeguards into technologies being introduced by Services Campaign projects, Peter and Lachlan have been holding workshops with members of Memory Machine, In My Seat and Panopticon. These workshops used the Moral-IT and Legal-IT cards developed as part of the ongoing Towards Moral-IT and Legal-IT research at Horizon Digital Economy Research.

 

As mentioned in the previous blog, the Legal-IT cards translate a range of data-related legal frameworks into card form, from the new EU General Data Protection Regulation 2016 and Network and Information Security Regulation 2016 to the earlier Cybercrime Convention 2001. The Moral-IT cards pose difficult ethical questions clustered under the themes of privacy, security, law and ethics, such as “IDENTITIES MANAGEMENT: does your technology enable users to hold and manage multiple identities?” or “SUSTAINABILITY AND eWASTE: what effects does your technology have on the environment from creation to destruction?”. These thought-provoking questions help participants to think of unexpected implications of their technology.

During a workshop, participants were asked to reflect on the technology they were building and identify an overall ‘ethical risk’ that may affect the social desirability of the technology for its users, particularly in relation to its use of personal data. This could include identity risks from sensitive data being compromised by poor data security practices, or privacy harms from individuals’ private details being made visible to unexpected parties. The groups used the Moral-IT and Legal-IT cards in a streamlined ethical impact assessment process to reflect on the overall risk, discuss and identify potential safeguards against these risks, and identify challenges in implementing those safeguards. This activity resulted in a wide range of critical ethical questions being explored in relation to the technology, with the cards and the structure of the task enabling the participants to navigate difficult ethical questions and link their technology to wider ethical and legal concerns.

The cards were also used as part of a workshop run by Lachlan and Martin Flintham, as part of their Digital Research-funded project, to generate thought and encourage discussion about the ethical implications of using the ‘Internet of Things’ in both university and research environments.

We were pleased with how the participants took to the cards. They enjoyed using them and found them helpful in exploring and engaging with the ethical and legal issues relating to their technology, and the cards proved useful in structuring debate around complex topics while bringing a wide range of issues to the fore. We are therefore encouraged that the cards have the potential to be a particularly useful tool in enabling technology developers and users to reflect on and navigate the complex ethics of their technology, and to produce more socially desirable technology as a result.

If you would like to know more, the Moral-IT and Legal-IT cards, and an outline of a way to use them, are now available to download online from ‘Experience Horizon’ – a website which provides opportunities to try out some of the outputs from projects conducted at Horizon Digital Economy Research. If you do choose to investigate them further, we would really like to build up a dialogue about who you are, how and why you are using the cards, and any feedback you have on the tool/process. Please send this to lachlan.urquhart@gmail.com.

Members of the Privacy, Law and Ethics Cross Cutting Theme are planning their analysis and preparing a paper to submit to the Journal of Responsible Innovation towards the end of the year.

Finally, following a presentation of the project, the Privacy, Law and Ethics Cross Cutting Theme has been neatly summed up in visual form, as can be seen below – our thanks and acknowledgment to Rikki Marr of HAWK&MOUSE.

Written by: Lachlan Urquhart and Peter Craigon

Media and Memory

For the entirety of my adult life, I’ve been studying culture, based on the conviction that media products (however ‘mindless’ and ‘disposable’ many claim them to be) play an incredibly valuable role in all our lives. This is because they are inextricably bound up with our wider experiences of the world and of other people, and with our emotional reality.

It’s easy to identify moments from my own life that illustrate this point. Anaesthetising my teenage anxiety, while I waited to hear if I’d got my University place, by concentrating instead on the characters in a favourite book, Annie Proulx’s The Shipping News. Escaping to Middle Earth after my PhD examination by watching the Lord of the Rings Trilogy (extended editions) back-to-back. Euphorically dancing around the flat with my newborn daughter to Paolo Nutini’s Pencil Full of Lead singing “best of all, I got my baby”.

These formative experiences that stick in my memory are linked to and enriched by the media I consumed in those moments. And perhaps the stickiness of those memories is reinforced every time I encounter that content again. Certainly particular pieces of media trigger particular memories, and that nostalgia can be quite visceral. For example, Beyoncé’s Crazy in Love reimmerses me in another swelteringly hot summer – 2003 – when that hit single seemed to be continuously blasting through the open windows of every vehicle in London.

Lots of amazing, imaginative work is being done to take advantage of the propensity of media to ‘transport us’ in time and space, especially when memories and/or media become harder to access. The WAYBACK is a virtual reality film, funded by £35,000 pledged to a Kickstarter campaign, that recreates Coronation Day 1953 to help those living with Alzheimer’s and their carers recall the conversations, music and atmosphere of a street party. In situations when people’s cultural worlds become restricted, digital apps can also help maintain access to content and all its benefits. Armchair Gallery is developing an app to enable digital access to, and creative interaction with, artworks in collections for those who cannot physically visit them.

What excites me about the Memory Machine idea is imagining an in-home media repository cum player that could automatically connect personally important content (e.g. a pop song) with a period of time (e.g. when you added it to your music collection or listened to it a lot) and with other contemporaneous media (e.g. a film or advert of the time that featured the song). This has the potential to generate multi-layered, multimedia connections between individual and historical context. More than that, a system that could link one person’s cultural experiences with those of people around them would also transcend the artificial limitations we all apply to media on the basis of personal taste. I think it would be wonderful if my daughter could one day, as an adult, get a sense of the love and joy she brings me by being played a pop song from ‘before her time’.

Written by Dr Sarah Martindale

In My Seat Workshop 3: Prototyping a new digital service for public transport users

 

We are creating a digital experience which will make your bus journey more enjoyable and more interesting by linking you to various types of content, including local information, mini-games, and user-generated content, through your specific seat or vehicle.

You are invited to take part in a workshop on Wednesday 15th August at 2pm, taking place in A19 of the Nottingham Geospatial Building on Jubilee Campus, University of Nottingham. The workshop will last approximately one and a half hours.

The aim of the workshop is for you to interact with, refine, and give feedback on paper prototypes and mock-ups of the service, which will then feed into the development of the app.

We are particularly interested in

  • When and how you would access different types of content
  • How you would switch between different ‘modes’
  • What information you would save and/or share with other travellers

This will involve paper-based activities and discussion, and you will be thanked for your time with a £10 high street/Amazon voucher.

For more information, and to sign up for the workshop, please email Dr Liz Dowthwaite

 

Panopticon

The Panopticon project is developing a system for measuring how visitors to the National Videogame Arcade in Nottingham engage with the various wild and wonderful games exhibited in that space. We’ve taken the name from the infamous prison design in which occupants were watched at all times by a single guard; the modern twist, however, is to involve visitors to the NVA in a meaningful conversation and decision about what’s done with the data they generate.

We’ve spent the first couple of months thinking about and beginning to develop our engagement tracking system. This is a computer vision-based system that, using cameras mounted on various games in the NVA, will watch and measure how engaged players are. That’s more complicated than it sounds. Firstly, the NVA has lots of different kinds of games and exhibits that players and visitors can explore. We did an initial site visit with Exhibition Manager Alex and came to the conclusion that we want to experiment with tracking engagement with three different kinds of exhibit:

  • Arcade games, with one or two people standing in front of a display playing the game
  • Room-scale games, of which the NVA has a few, that involve players physically moving and leaping around
  • Traditional exhibits kept in glass cases that visitors can peruse

The variety here presents some challenges for how we can measure engagement. The technology we’re developing is built into a small form-factor PC and camera that can be attached to the different exhibits. The tech will watch and interpret the body poses of the people using the exhibit, how they’re standing or moving, and also what kind of facial expressions they’re pulling: for example, whether they are laughing, focused, or excited.
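To give a concrete flavour of what such a camera unit might do, here is a minimal sketch using off-the-shelf libraries (OpenCV and MediaPipe). It is not the project’s actual pipeline: the ‘lean’ proxy, the Haar-cascade face detection and the expression-classifier stub are all illustrative assumptions.

```python
# Rough illustration only: per-frame body pose and face capture with OpenCV + MediaPipe.
import cv2
import mediapipe as mp

pose = mp.solutions.pose.Pose(static_image_mode=False)
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_crop):
    """Hypothetical stand-in: a trained expression classifier would go here."""
    return "unknown"

cap = cv2.VideoCapture(0)  # camera attached to the small form-factor PC
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Body pose: MediaPipe returns 33 landmarks, each with a relative depth (z).
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks:
        nose = results.pose_landmarks.landmark[mp.solutions.pose.PoseLandmark.NOSE]
        lean_proxy = -nose.z  # crude proxy: nearer the camera ~ leaning forward

    # Faces: detect each face and (in a real system) classify its expression.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        expression = classify_expression(frame[y:y + h, x:x + w])

cap.release()
```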

To be able to make sense of what the camera sees, we first need to train the technology, or rather for the technology to learn. We’ve built a temporary gaming booth fitted out with cameras in the Mixed Reality Lab, and the next step will be to invite people to visit and spend half an hour playing on an Xbox to capture the initial data that we need. Using this data we can build a model of measurable engagement, which we can then use to measure the engagement of visitors to the NVA.
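As a rough illustration of that training step, the sketch below assumes per-window features (e.g. lean, smile frames) have already been extracted from the booth recordings and manually labelled for engagement; the file names, features and choice of classifier are hypothetical, not the project’s actual model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical inputs: one row per short window of booth footage, with features such as
# [mean lean, smile frames, expression changes, motion energy], plus manual labels
# (0 = low engagement, 1 = high engagement) added while reviewing the recordings.
X = np.load("window_features.npy")
y = np.load("window_labels.npy")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```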

But this is only half of the picture. The other work we’re doing is to figure out how to have a conversation with the player about the data that we’re capturing, rather than just capturing everything wholesale, which most people would quite rightly think was overly intrusive. Each player is given a token that they can use to explicitly signal that they are engaging with the game, and which also acts as an access token to the data that is being captured. This token, and whether the player chooses to give it to someone else, or even gift it to the NVA, drives the conversation about data ownership. We’re exploring some lightweight NFC tags that can be easily integrated into a variety of form factors.
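A minimal sketch of how such a token might gate data capture is shown below. The reader function is a hypothetical stand-in for whatever NFC library is used, and the session structure is illustrative only: the point is that the tag’s UID both starts the capture and acts as the access key to the resulting data.

```python
import time
import uuid

def read_tag_uid():
    """Stand-in for an NFC reader library: return the UID of the presented tag."""
    return input("Scan token (enter tag UID): ").strip()

active_sessions = {}  # tag UID -> metadata for the data being captured at an exhibit

def start_session(uid, exhibit_id):
    """Presenting the token at an exhibit starts capture and keys the data to the token."""
    active_sessions[uid] = {
        "session_id": str(uuid.uuid4()),
        "exhibit": exhibit_id,
        "started": time.time(),
        "clips": [],  # references to captured video segments
    }

def end_session(uid):
    """Stop capture; whoever holds the token decides what happens to the data next
    (keep it, hand it to someone else, or gift it to the NVA)."""
    return active_sessions.pop(uid, None)

if __name__ == "__main__":
    uid = read_tag_uid()
    start_session(uid, exhibit_id="arcade-cabinet-1")
    # ... capture runs while the player is at the exhibit ...
    session = end_session(uid)
```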

 

We’re having some interesting conversations with the NVA about what the token should look like, and we’re prototyping possible ideas using additive manufacturing. It could look like a coin, inspired by old coin-operated arcade games, but then we don’t actually want players to lose their tokens inside a machine. Or it could look like Portal’s Companion Cube, where the material emancipation grill erases unwanted data about you when you leave the exhibition.

By Martin Flintham

My Memory Machine

A collection of filing boxes with a post-it note that reads "Dominic's little bits from the past"

In one of those odd cases of universal synchronicity, as I was struggling to write a blog about memory, my parents arrived the weekend just gone, bearing boxes of my ‘childhood memories’ that they had been saving for me and now wanted me to go through. As I write this blog, those boxes sit next to me on the desk, unopened. I expect that for many people the immediate reaction would have been to dive straight in and revel in the nostalgia. For me, though, my approach to memory seems to be much more about caution. I’m wary of what emotions a memory might trigger: will I be embarrassed? Will I feel sad? Chances are that the feelings evoked will be pleasant ones, but I prefer to be in control, to stabilise myself beforehand and to look through when I feel ready. It could be weeks before I get around to going through the boxes.

Although I suspect that this may be the minority approach to memory, I also suspect that it is far from uncommon. In relating my own personal perspective on memory to the Memory Machine, I can see that technology is increasingly becoming an important gatekeeper to memory. Take, for example, social media platforms such as Facebook: so much of our daily lives is now stored on them, and these platforms are also taking advantage of those memories; a common feature is to highlight an ‘on this day’ memory. Yet these memories are not always wanted; reminders of a bereavement or break-up are quite common. Not all memories are equal: they can be important whilst still being negative. It will be interesting, in the Memory Machine workshops and the development of the technology, to see these tensions discussed and to see the role that technology can play in safeguarding both memory and an individual’s emotional state.

Dominic Price, Research Fellow, Horizon Digital Economy Research

In My Seat – Would you like to participate in our project?

Do you regularly use public transport in Nottingham?

Would you like it to be more enjoyable/informative?

 We are creating a digital experience which aims to enhance your everyday public transport journeys, making them more enjoyable and interesting by linking you to various types of content, including local information, mini-games, and user-generated content, through your specific seat or vehicle.

You are invited to take part in a workshop on the 25th of April, 2.00pm at the Geospatial Building (room A19), Jubilee Campus, University of Nottingham. The workshop will last approximately 1 hour.

The aim of the workshop is to:

  • develop the types of content that you would find most useful and enjoyable
  • discover how you would like content to be presented to you
  • design forms of interaction between users and the content / experience

This will involve paper-based activities and discussion, and you will be thanked for your time with a £10 high street/Amazon voucher.

For more information, and to sign up for the workshop, please email Dr Liz Dowthwaite

‘Catch and Connect’ on Nottingham Buses

It’s Monday morning and I’ve just caught the 8.52am bus into the city, which I do every day for work. I show my travel card to the usual driver, who nods; we don’t speak, and I make my way to my usual seat, third row back, facing forward, on the left. I acknowledge one or two of the many faces I see every day, but we don’t speak. Familiar strangers. I get my phone out, put my earphones in, put my playlist on, put my head down and start to disappear into my social media, my emails, my photos, my bubble. I’m immersed, but become aware that the bus has stopped and people are starting to look up and around; nobody seems to know what is happening, but nobody speaks. I notice several people on CityCycles go past, something I’d quite like to try if I knew where or how to hire them. Finally the bus continues and people return silently to their phones. Despite rarely looking up, after making this journey twice a day for 3 years I instinctively know my stop is next, outside the Concert Hall. I often wonder what shows they have on and still keep meaning to find out, but never seem to find the time. I leave the bus, stopping to buy a coffee in the usual place, before walking the last 10 minutes along my usual route down the high street.

Unfortunately, not only can such everyday journeys on public transport take up a notable proportion of our day, they can often be monotonous, isolating and largely unfulfilling experiences. What if we could address this by offering (bus) passengers ‘dynamic’ and more enjoyable, engaging and relevant digital content, both enriching potential connections within the on-board passenger ‘community’ and enabling connections to be made with the external environment en route? The ‘In My Seat’ project aims to do this by offering pertinent, personalised, passenger-driven content, both stakeholder-delivered and user-generated, which can be accessed via a mobile app linked directly to individual sensors in a passenger’s seat or vehicle.
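As a purely illustrative sketch of the kind of lookup this implies, the snippet below assumes the app resolves a (vehicle, seat) pair reported by the seat sensor, plus the next stop, into a mix of seat-linked user-generated content and stop-relevant local information; all names and content categories are hypothetical rather than the project’s actual design.

```python
from dataclasses import dataclass

@dataclass
class ContentItem:
    category: str  # e.g. "fun_fact", "delay_alert", "local_offer", "mini_game"
    text: str

# Illustrative content keyed by upcoming stop, plus user-generated items keyed by seat.
ROUTE_CONTENT = {
    "concert_hall": [ContentItem("local_offer", "Lunchtime deal at the cafe by the Concert Hall")],
}
SEAT_CONTENT = {
    ("bus_42", "3L"): [ContentItem("fun_fact", "A fact left by a previous passenger in this seat")],
}

def content_for(vehicle_id, seat_id, next_stop):
    """Combine seat-linked, user-generated content with stop-relevant local information."""
    return SEAT_CONTENT.get((vehicle_id, seat_id), []) + ROUTE_CONTENT.get(next_stop, [])

print(content_for("bus_42", "3L", "concert_hall"))
```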

For the passenger, such rich and context-specific content may include a ‘today’s fun fact’ or joke left by a previous passenger, or an ongoing, on-board bus game. More practically, it may offer real-time notifications of potential delays, information on complementary, sustainable transport (e.g. city bike hire), or alternatively upcoming shows at a local theatre or lunchtime deals at cafes along the route. For public transport operators and city councils, being able to identify when, where and how many people are using particular public transport modes (e.g. buses) is invaluable in being able to ‘evidence’ need and demand and thereby align services and supporting infrastructure effectively.

With this in mind, the project will shortly run a series of stakeholder engagement and user (passenger) design workshops to frame and (co-)design our initial concept(s) and mobile app. Outputs from these workshops will subsequently be posted on the ‘In My Seat’ blog.

Nancy Hughes, Research Fellow, Human Factors Research Group, Faculty of Engineering, University of Nottingham