One interface, one hundred million end users: SDK product design
May 22, 2020
“Oh, a product designer? So, you’re, like, making the interface look pretty?”
No, not quite. If you open a thesaurus, you’ll find that synonyms for ‘designer’ include ‘inventor,’ ‘architect,’ ‘artist’ and ‘strategist.’ In reality, we don’t get to design a single mock-up until we define who our users are and what problems they’re facing. And that requires wearing many hats.
Now, designing for an SDK can get even more chaotic. Believe us, our design team at Microblink knows this firsthand. Our flagship product, BlinkID, is an integral part of some 500 apps, ranging from airline operators to financial services providers. This means we’re in a constant loop of researching, designing, testing and reinventing for multiple target groups.
First, our end users who use BlinkID to onboard for products and services more easily by scanning their ID.
And second, our clients who wish to quickly and painlessly add our SDK into their app. Think of it as a puzzle piece they use to attract and engage the above users while streamlining their own processes.
To strike this balance between a user-friendly experience and a client-friendly look, product design needs to evolve alongside technology. Plus, it needs to transfer the brand’s look and feel, so that the entire experience feels on-brand and a part of one app.
A chameleon-like design that blends with its surroundings
Over the years, we’ve noticed a reassuring trend. BlinkID’s user interface, despite being fully customizable, is widely accepted in its default form.
Our clients seem to dig the UX they get out of the box and the majority of them implement it in their apps with only some slight variations in fonts, colors, and other minor elements.
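To illustrate what those light-touch overrides might look like, here is a minimal sketch of a theme object for an ID-scanning SDK. This is purely hypothetical — the names and defaults are ours for illustration, not BlinkID’s actual API — but it captures the idea: a sensible default theme that clients tweak only at the edges.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ScanTheme:
    """Hypothetical UI theme for an ID-scanning SDK (illustrative only)."""
    font_family: str = "Roboto"
    accent_color: str = "#0066FF"   # reticle and highlights
    text_color: str = "#FFFFFF"     # on-screen instructions
    success_color: str = "#2ECC71"  # checkmark flash

# The out-of-the-box look most clients ship as-is
DEFAULT_THEME = ScanTheme()

# A client keeps the defaults and swaps in only its brand font and color
client_theme = replace(DEFAULT_THEME, font_family="Lato", accent_color="#D32F2F")
```

Keeping the default immutable and deriving client themes from it is one way to guarantee that an integration never ends up with a half-configured UI.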
Businesses trust our ability to delight their users with great experiences and we reward them for it. Day after day, we strive to make the technology behind BlinkID more user-centric while keeping the interface visually neutral and less noticeable.
Keeping an AI on augmented reality
In 2017, augmented reality (AR) started reshaping the way we interact with mobile technology. UI was no longer bound by the edges of a smartphone screen; it began leaking out into the real world.
As AI-powered software, BlinkID was well-positioned to make the most of this trend. Applying visual feedback to the document a user was holding, for example, was a great way to provide a more immersive, natural experience.
Instinctively, we jumped at the opportunity to better engage our end users. BlinkID was already smart enough to recognize and track identity documents so we designed an AR interface that hand-holds people throughout the scanning process.
After showcasing our hard work at Finovate 2019, we were met with applause, but it turned out the world wasn’t ready for AR-driven UI.
Thinking outside of the box doesn’t always yield success
Carried away with the wonders of augmented reality, we nearly lost touch with the actual reality our end users face.
Additional development and testing of the demo app revealed that, while our AR interface performed fine on the latest iOS smartphones, it struggled to keep up on Android phones.
This meant that the new interface wouldn’t work for a good chunk of our clients’ users. And since we want to provide an equally enjoyable experience of using BlinkID regardless of the device, an AR UI on high-end smartphones alone was not an option.
Rising from the ashes: The best interface is no interface
Short on time, we thought long and hard about the best approach for reinventing BlinkID’s UI.
Days of brainstorming and extensive research bore fruit. We had come up with a UI and UX partly inspired by Google’s Material Design guidelines (its patterns for machine learning features) and partly designed from scratch.
Among the biggest changes we made was a pulsating reticle which completely transformed the way end users interact with BlinkID.
For one, they didn’t have to place a document within the dedicated frame anymore. Thanks to the advances in technology behind BlinkID, a user scanning an ID card no longer has to worry about the document placement, orientation, or angle.
Also, as BlinkID was able to recognize the document type itself, we removed the pre-screen prompting users to select the document before scanning it, saving them a few extra seconds of trouble.
The reticle displays on-screen instructions in real time, which means a user knows when to flip the document or move it farther away from the camera. This form of communication guides a user towards taking correct actions.
A quick flash, together with a checkmark at the end means the information has been extracted successfully. To unify the new experience across Android and iOS, we kept all of the changes consistent, bar some minor visual elements.
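The guidance loop described above — point, flip, adjust distance, succeed — can be sketched as a small state machine. The states, instruction strings, and transition rules below are our own simplified assumptions for illustration, not the SDK’s real internals.

```python
from enum import Enum, auto

class ScanState(Enum):
    SEARCHING = auto()       # no document recognized yet
    FRONT_CAPTURED = auto()  # front side read; waiting for the flip
    TOO_CLOSE = auto()       # document fills too much of the frame
    DONE = auto()            # both sides extracted; flash + checkmark

# Hypothetical real-time instructions the reticle would display per state
INSTRUCTIONS = {
    ScanState.SEARCHING: "Point the camera at your ID",
    ScanState.FRONT_CAPTURED: "Flip the document",
    ScanState.TOO_CLOSE: "Move the document farther away",
    ScanState.DONE: "Success",
}

def next_state(front_ok: bool, back_ok: bool, too_close: bool) -> ScanState:
    """Pick the feedback state from one camera frame's recognition results."""
    if too_close:
        return ScanState.TOO_CLOSE
    if front_ok and back_ok:
        return ScanState.DONE
    if front_ok:
        return ScanState.FRONT_CAPTURED
    return ScanState.SEARCHING
```

Driving the reticle from a function of per-frame results, rather than from ad-hoc flags, is what makes it possible to keep the guidance consistent across Android and iOS.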
You can now see how it’s all about connecting the dots for the end user and stripping the interface down to its bare bones.
Check yourself before you wreck yourself: Usability testing
None of the above changes would have seen the light of day without proper testing. We’re not all-knowing masterminds of UX design. That’s what usability testing is for. Nothing beats seeing real users interact with your design right in front of your eyes.
The whole point of usability testing is to find cheap fixes for otherwise expensive problems. There is simply no better way to weed out interface flaws and ensure a product meets user expectations.
Usually, the process involves users interacting with the design prototype, an early sample of the design. This is basically an array of images of the UI that are linked together to show a flow of the finished app.
But prototyping is impossible with BlinkID, as it requires a user to interact with the camera in real time. Instead, our developers devoted plenty of time to building a fully functional test app.
After a good deal of joint effort, the app was ready. Notebook, check. Pen, check. Questions, all thought of. We were ready to usher real users into the room and watch them interact with the new interface in person.
Observing users in the wild
It was interesting to see how some, for example, held their documents in hand while others laid them flat on the table. Also, a number of users scanned their documents in landscape mode, reaffirming our decision to enable both orientations.
During this process, we discovered how scanning is a very personal experience for every user and that we need to consider it in every step of planning. Taking notes and gathering feedback from real users helped us significantly improve the UI and the overall user experience.
For example, we learned that our on-screen messages disappeared before some users could even read them. That’s when we joined forces with our developers. After going through countless iterations, we finally found the right balance of timing and duration for feedback messages and animations. They were now easy to spot, read and understand without affecting the processing speed.
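One common way to solve the disappearing-message problem — offered here as a sketch under our own assumptions, not as the production code — is to enforce a minimum on-screen duration per message, so that a fast recognizer can’t swap instructions before a user has had time to read them.

```python
class FeedbackDisplay:
    """Shows one feedback message at a time, holding each for a minimum duration."""

    def __init__(self, min_display_ms: int = 1500):  # 1500 ms is an assumed value
        self.min_display_ms = min_display_ms
        self.current = None
        self.shown_at = None

    def update(self, message: str, now_ms: int) -> str:
        """Return the message to display at time now_ms.

        A new message replaces the current one only after the current
        message has been visible for at least min_display_ms.
        """
        if message != self.current:
            held_long_enough = (
                self.current is None
                or now_ms - self.shown_at >= self.min_display_ms
            )
            if held_long_enough:
                self.current = message
                self.shown_at = now_ms
        return self.current
```

The recognizer keeps reporting its latest state every frame; the display layer simply refuses to churn through messages faster than people can read.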
We can’t stress the importance of repeated usability testing enough. It assured us the changes we made actually hit the right note with our end users. Without it, we’d essentially be designing for ourselves.
When all is said and done
After all this struggle, we’re left with a more unified UI, but also with a more unified team. Each and every one of us pulled our weight so that BlinkID could be enjoyed in all its scanning glory.
In doing so, we have learned a lot about design, development, and technological capabilities, as well as the importance of connecting the three for the end user. These discoveries are now forming a key part of our product design and development cycle.
So what does the future hold? Besides an occasional rainy day, plenty of good, no doubt about it. Mobile AR keeps evolving and new ways of engaging with the end users keep emerging. At Microblink, we’ll keep leveraging them to make our products simpler and more enjoyable to use.