In Q2 of 2016, I was invited to lead the interaction design for an internal Google tool called "ADA," or "Android Device API." Looking at the larger Android data ecosystem, it became apparent that there was no single source of truth for the many Android device data points coming from, and being sent to, Android device partners. These partners included device manufacturers, carriers, retailers, and Googlers themselves. In response, ADA was born. I worked on a team of seven: three project managers, two engineers, one visual designer, and myself. I joined the project in its early stages, when the requirements were still unclear, and stayed through the successful delivery of a fully spec'ed UX system upon which the visual design would be layered and the product would be built. The product is still in development today.
I kicked off the project by collaborating with the technical lead. We discussed the core requirements at length, and he passed along all existing documentation outlining customer expectations of the software. I reviewed these documents carefully to ensure I understood the system end-to-end before beginning the user experience design.
With the information gathered from the documentation and our conversations, I synthesized the system's requirements into a series of user profiles and user stories. I noted the core users of the system and the features each would expect to access when engaging with it.
Because the system had multiple levels of user permissions, I created a series of user flows charting the application's areas and annotating which user types had access to each. This made it clear to developers and stakeholders which areas of the system each user could access and which were restricted.
After understanding the system's users and core features, I moved the designs into low-fidelity wireframes. These wireframes represented the bare-minimum UI needed to satisfy each previously designated user story and system requirement.
Once the overall feel and flow of the wireframes was ironed out, we evolved the low-fidelity wireframes into a state more recognizable to the common consumer, using Material Design-styled wireframe elements. This helped project managers and everyday users better grasp the UI.
It was then time to begin testing the new interface. We moved the designs into clickable prototyping software and leveraged the prototype to gather early feedback on the system's design. Clicking through the experience let us refine the interactions against closer-to-real-world user behavior.
Once the desktop version had cleared its major hurdles, we moved on to ensuring each page was optimized for mobile. While the system's core use case is not a phone-based one, it was important to consider how the experience would unfold on a mobile device. This way we could be sure the UI would support all core actions, even from a phone.
Lastly, I moved each wireframe (desktop and mobile) into a document where I annotated the user interactions throughout the software. This documentation made the system's technical requirements clear to developers as they built the application.
In retrospect, leading the interaction design for Google's Android Device API was a larger task than initially expected. The wireframes themselves went through nearly ten iterations. Given how central this software would be to the Android device ecosystem, there was no room for less-than-extraordinary results. I'm pleased to say the results were well received by the team, and the UX workflow remained seamless throughout the entirety of the project.