Microsoft Puppets
Principal Design Manager
2017-2019
Overview
Microsoft Puppets, announced in 2019, lets SwiftKey users create and share videos of their own virtual puppets that mimic their facial expressions, head movements, and voice. Users choose from five playful characters and share their creations from Android devices.
Role
While managing the Computer Vision (CV) design team within the Microsoft Research (MSR) group, part of my scope was to push my team to find ways to innovate with and productize computer vision technology.
We were keen to find ways to leverage our face-tracking technology along with our expression- and emotion-sentiment computer vision APIs. We began to experiment with machine learning models using another product I was overseeing, Azure Cognitive Services Custom Vision, which lets developers create custom computer vision models from very little training data.
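For readers curious about the mechanics, here is a minimal sketch of that training workflow using the Custom Vision Python SDK. This is illustrative rather than our production code; the endpoint, key, project name, and file paths are placeholders, and the three labels anticipate the prototype described below.

```python
# Minimal sketch of training a small classifier with the Custom Vision
# Python SDK (azure-cognitiveservices-vision-customvision). The endpoint,
# key, project name, and file paths are placeholders, not production values.
from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from azure.cognitiveservices.vision.customvision.training.models import (
    ImageFileCreateBatch,
    ImageFileCreateEntry,
)
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
TRAINING_KEY = "<training-key>"

sample_paths = {  # a handful of labeled example images per tag
    "sunglasses": ["data/sunglasses/01.jpg", "data/sunglasses/02.jpg"],
    "glasses": ["data/glasses/01.jpg", "data/glasses/02.jpg"],
    "no glasses": ["data/no_glasses/01.jpg", "data/no_glasses/02.jpg"],
}

trainer = CustomVisionTrainingClient(
    ENDPOINT, ApiKeyCredentials(in_headers={"Training-key": TRAINING_KEY}))

project = trainer.create_project("glasses-classifier")
tags = {name: trainer.create_tag(project.id, name) for name in sample_paths}

entries = []
for tag_name, paths in sample_paths.items():
    for path in paths:
        with open(path, "rb") as f:
            entries.append(ImageFileCreateEntry(
                name=path, contents=f.read(), tag_ids=[tags[tag_name].id]))

trainer.create_images_from_files(project.id, ImageFileCreateBatch(images=entries))
iteration = trainer.train_project(project.id)  # training runs in the service
```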
During our concept sprints, we narrowed our focus to one idea: train a CV model that could look at an individual and then deconstruct them into an illustration, much like a Bitmoji.
Before we took on the full project scope, we decided to test the capabilities of our Custom Vision product. We built a simple iOS app and trained a CV model to identify whether individuals were wearing sunglasses, glasses, or no glasses; the app would then show users a simple emoji character in the matching pose.
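Once a model iteration is published, querying it amounts to a single prediction call. A minimal sketch, assuming placeholder keys and a published iteration name, with the prototype's three poses mapped to emoji:

```python
# Minimal sketch of querying the published model and picking an emoji pose.
# PROJECT_ID, PUBLISHED_NAME, the keys, and the image path are placeholders.
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
PREDICTION_KEY = "<prediction-key>"
PROJECT_ID = "<project-id>"
PUBLISHED_NAME = "<published-iteration-name>"

EMOJI_FOR_TAG = {"sunglasses": "😎", "glasses": "🤓", "no glasses": "🙂"}

predictor = CustomVisionPredictionClient(
    ENDPOINT, ApiKeyCredentials(in_headers={"Prediction-key": PREDICTION_KEY}))

with open("selfie.jpg", "rb") as image:
    result = predictor.classify_image(PROJECT_ID, PUBLISHED_NAME, image.read())

# Take the highest-probability tag and show the matching emoji pose.
top = max(result.predictions, key=lambda p: p.probability)
print(f"{top.tag_name} ({top.probability:.0%}) -> {EMOJI_FOR_TAG[top.tag_name]}")
```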
We then explored other uses for our new computer vision model and began iterating on further prototypes.
These investigations led us to a larger project: tracking individuals' expressions and mapping them onto animated "puppets". Soon after we began prototyping this new direction, Apple announced its Animoji technology with the launch of iOS 11 on the iPhone X. Apple had done a killer job on the back of significant investment in its depth-sensing camera technology, and we began to question whether we should proceed in a similar direction.
At the same time, my team was also overseeing the SwiftKey product, and Microsoft had just entered a technology partnership with Huawei to ship SwiftKey on their flagship devices outside the PRC. Huawei was very interested in our computer vision technology, as they too wanted a compelling feature like Apple's, so we began to develop an Animoji competitor.
Our technology took a very different approach, working with any RGB camera, and we focused on developing the feature within the SwiftKey product so it could reach not only Huawei users but all of our existing B2B and B2C SwiftKey users.
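Whatever the camera, the core of this kind of puppeteering is retargeting per-frame expression signals from a face tracker onto a rigged character. The sketch below is purely illustrative, with hypothetical signal names and a toy Puppet type; it is not the engine we shipped.

```python
# Illustrative sketch only (not Microsoft's engine): mapping per-frame
# expression scores from a face tracker onto blendshape weights of a rigged
# puppet. The expression names and Puppet type here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Puppet:
    # Blendshape weights in [0, 1], keyed by shape name.
    weights: dict = field(default_factory=dict)

    def apply(self, shape: str, weight: float) -> None:
        self.weights[shape] = max(0.0, min(1.0, weight))

# Hypothetical mapping from tracker expression signals to puppet blendshapes.
EXPRESSION_TO_SHAPE = {
    "smile": "mouth_smile",
    "jaw_open": "mouth_open",
    "brow_raise": "brows_up",
    "eye_blink_left": "blink_L",
    "eye_blink_right": "blink_R",
}

def drive_puppet(puppet: Puppet, scores: dict, smoothing: float = 0.5) -> None:
    """Blend new tracker scores into the puppet's current weights.

    `smoothing` trades responsiveness for stability: 0 freezes the puppet,
    1 follows the tracker exactly (and jitters with it).
    """
    for expression, shape in EXPRESSION_TO_SHAPE.items():
        current = puppet.weights.get(shape, 0.0)
        target = scores.get(expression, 0.0)
        puppet.apply(shape, current + smoothing * (target - current))

puppet = Puppet()
drive_puppet(puppet, {"smile": 0.8, "jaw_open": 0.2})  # one tracked frame
print(puppet.weights)
```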
In the project's early stages, my role was split between Project Manager, Design Manager, and Art Director; I hired and directed a 3D artist to help create a series of characters for the first release.
After the project received additional funding and headcount, I was able to hand over my PM responsibilities and focus on managing the overall UX development inside SwiftKey. My team worked directly with the software engineers and program managers to ensure that the puppets met a series of benchmark studies. We needed to test our CV models to ensure they were optimized to recognize a wide range of individuals across age, gender, race, extreme lighting conditions, and accessories such as glasses and piercings.
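One way to run such benchmark studies is to slice model accuracy by cohort and lighting condition and look for gaps. A hedged sketch, assuming a hypothetical record layout that pairs each prediction with ground truth and cohort metadata:

```python
# Illustrative sketch only: summarizing benchmark accuracy across labeled
# cohorts (age band, lighting, accessories). The record layout is hypothetical.
from collections import defaultdict

records = [
    # (ground_truth, prediction, cohort_tags)
    ("glasses", "glasses", {"age": "18-30", "lighting": "low"}),
    ("no glasses", "glasses", {"age": "60+", "lighting": "normal"}),
    ("sunglasses", "sunglasses", {"age": "31-45", "lighting": "backlit"}),
]

def accuracy_by_cohort(records, attribute):
    """Return per-cohort accuracy for one metadata attribute."""
    hits, totals = defaultdict(int), defaultdict(int)
    for truth, prediction, tags in records:
        cohort = tags[attribute]
        totals[cohort] += 1
        hits[cohort] += int(truth == prediction)
    return {cohort: hits[cohort] / totals[cohort] for cohort in totals}

for attribute in ("age", "lighting"):
    print(attribute, accuracy_by_cohort(records, attribute))
```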
Outcome
Microsoft Puppets is currently in public beta and has been very well received. The project's distinctive technical approach led to the creation of a CV expression engine and an enhanced facial-tracking system, and we began to investigate which other groups across Microsoft could benefit from this new approach. Some of the work is still confidential, but I can share that my team began working with other groups across Microsoft on their products and services: Microsoft Teams, HoloLens, Windows Hello, and Xbox studios.