National Australia Bank – Yourself
We created an interactive, voice-activated experience to help people start to reflect on what they really want in life. Combining the latest facial analysis and speech recognition technology, the web experience allows users to create art from their own face and voice.
During the production phase, I coordinated and led the development team to deliver the experience, keeping security and scalability in mind throughout the process.
Once the illustrated face has been generated, the second step is to start talking to it. As people speak to the illustration, their voice is processed with speech analysis and natural language processing, which identifies what they are saying and transforms the main themes into illustrated patterns that fill the different parts of the face in real time.
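The transcript-to-themes step can be sketched as a simple keyword-matching pass over the live transcript. This is an illustrative stand-in for the real NLP pipeline; `extractThemes` and the vocabulary are hypothetical names, not the production code:

```typescript
// Illustrative only: count occurrences of known theme keywords in a
// transcript and rank them, so the strongest themes surface first.
type Theme = { keyword: string; count: number };

function extractThemes(transcript: string, vocabulary: string[]): Theme[] {
  // Tokenise to lowercase words, ignoring punctuation.
  const words = transcript.toLowerCase().match(/[a-z']+/g) ?? [];
  const counts = new Map<string, number>();
  for (const word of words) {
    if (vocabulary.includes(word)) {
      counts.set(word, (counts.get(word) ?? 0) + 1);
    }
  }
  // Most frequently mentioned themes come first.
  return [...counts.entries()]
    .map(([keyword, count]) => ({ keyword, count }))
    .sort((a, b) => b.count - a.count);
}
```

Re-running a pass like this each time a new interim transcript arrives keeps the ranked themes current, which is what makes the fill of the face feel real-time.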
The experience also gives NAB the ability – with the consent of users – to collect data and possibly even reach out to customers with support to make their dreams happen.
Speech analysis uses both the native Web Speech API and IBM Watson Cloud services to convert people’s voices into text transcripts. We then analyse the transcripts in real time using natural language processing, extracting main themes and metrics and transforming what the person is saying into animated illustrations. The symbols, shapes, positions, and colours of the patterns that fill the different parts of the illustrated face are all driven by voice data.
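As a sketch of that last mapping, assume a fixed lookup from themes to symbols and colours and a fixed list of face regions. All the names and values here are hypothetical, chosen only to show the shape of the data flow:

```typescript
// Hypothetical mapping: each ranked theme claims one region of the
// illustrated face and styles it with a symbol and a colour.
type Pattern = { region: string; symbol: string; colour: string };

const REGIONS = ["forehead", "leftCheek", "rightCheek", "chin"];
const THEME_STYLES: Record<string, { symbol: string; colour: string }> = {
  travel: { symbol: "plane", colour: "#1f77b4" },
  family: { symbol: "heart", colour: "#d62728" },
};

function patternsFor(rankedThemes: string[]): Pattern[] {
  // Strongest themes fill regions first; unknown themes get a fallback style.
  return rankedThemes.slice(0, REGIONS.length).map((theme, i) => ({
    region: REGIONS[i],
    symbol: THEME_STYLES[theme]?.symbol ?? "dot",
    colour: THEME_STYLES[theme]?.colour ?? "#999999",
  }));
}
```

Keeping this mapping a pure function of the ranked themes means the illustration can be redrawn from scratch on every update, with no animation state to reconcile.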
Designed to be highly available and scalable, the infrastructure relies entirely on AWS services. All back-end computing logic is serverless, making full use of ephemeral containers and functions.
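A serverless back end of this shape boils down to stateless handlers, in the style of an AWS Lambda function behind an HTTP endpoint. The event shape and field names below are simplified assumptions for illustration, not the production API:

```typescript
// Hypothetical stateless handler: receives a transcript chunk and
// returns a lightweight metric. No local state survives between
// invocations, which is what lets the platform scale it horizontally.
interface AnalyseEvent { body: string }
interface AnalyseResult { statusCode: number; body: string }

export async function handler(event: AnalyseEvent): Promise<AnalyseResult> {
  const { transcript } = JSON.parse(event.body) as { transcript: string };
  const wordCount = (transcript.match(/\S+/g) ?? []).length;
  return { statusCode: 200, body: JSON.stringify({ wordCount }) };
}
```

Because each invocation is independent, spikes in traffic are absorbed by running more copies of the function rather than by provisioning bigger servers.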
The web experience is built as a Progressive Web Application and is WCAG 2.0 AA compliant, making it accessible to people with auditory, cognitive, physical, or visual disabilities.