I have another project that I am working on, aimed at helping autistic children understand emotions through interactive learning and mimicry via gameplay. The concept is pretty simple – the child is shown a face, and they mimic it. On a correct expression, the child is awarded a point, and text and/or audio say something like "This is a happy face. I feel happy when I see my mummy or daddy".
I started by using a program called 'Wekinator'. Wekinator is machine learning software which basically allows me to put in inputs and train it so that it can give me outputs. There are so many things that you can do with Wek – you can train it to output audio when you move an object on a screen, or even yourself via webcam, amongst other stuff – but what I have done is train it to recognise several states of my face, i.e. expressions (see video below).
From this, I am able to create classes for different facial expressions such as happy, sad and angry, and bring these into openFrameworks, where a bunch of if statements and for loops execute something once each face is expressed.
The inspiration for this comes from my earlier post, which you can see here.