AI Is The New UI - AI + UX + DesignOps
Amit's expertise includes microservices, SOA, embedded and real-time systems, distributed architectures, and iterative development using various development and EAI methodologies. He leads numerous large development efforts and multiple teams at Avanade.
Today you cannot read any technology or business article without some discussion around Artificial Intelligence (AI), whether explicit or implicit. We see AI integrating more deeply into all elements of our lives, be it at home with our families or at work with our colleagues and clients. Some of these you can feel and 'see'; with others you experience only the output or outcome.
We don't see AI as a singing and dancing silver bullet. Instead, we think of it as 'ambient AI': like a sprinkling of magic that augments various things around us with pixie dust. Of course, this pixie dust is actually the various APIs (cognitive and otherwise), powered by data.
Fundamental concepts of a user, an interface, design and experience are changing. Powered by the right mix of AI and associated cognitive APIs, we are seeing the rise of new user interfaces and experiences with conversational UIs and intelligent assistants such as Microsoft's Cortana and Amazon's Alexa. And, in parallel, mobile apps are declining or at least reaching a plateau.
These new AI-driven interfaces are leading to deeper and more meaningful interactions, which are tailored to me as an individual and to my unique situation. We call this 'situational centricity', which is distinctly human, mostly invisible and seemingly magical. This situational centricity will move from one medium to the next, enriching, expanding, contracting and adapting as needs change.
Emergence of a Screenless World
Today, as we continue to see the convergence of cloud, big data and mobile, the degree of disruption is increasing: not only the pace, but also the reach and scale. The result is the rise of a screenless world.
Now this doesn't literally mean that there isn't a screen; rather, it is the expansion from screens to other mediums: voice, touch, haptics and other new interactions that were not previously possible. And these interactions are seamless, frictionless and fluid, spanning different environments.
Enterprises need to think of these immersive experiences (powered by situational centricity) as interactions, not as interfaces. For businesses, this can be a challenge and might require rethinking many aspects of the organization: not just the interaction layer, but all the way back to their core ERP, CRM and other line-of-business enterprise systems.
For example, I could be on a traditional desktop computer having an interaction on a screen, then shift to a mixed-reality interaction with holograms on a HoloLens device that incorporates not only vision but also spatial sound and awareness. I might then walk over to my car, switching to a voice-only interaction on my mobile along the way, and change again to a different screen with different capabilities once I am in the car. These experiences span a number of intelligent automation services: vision, speech, language and knowledge.
This kind of experience is only possible because of the various AI cognitive APIs. Some are vision APIs that sense the environment around us to help interpret intent; others are speech and language APIs that help comprehend and understand audio; still others are knowledge and graph APIs that help connect and interpret the intent and 'knowledge' around us.
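As a rough illustration of how such an experience might be wired together, the sketch below chains two cognitive-style REST calls, speech-to-text followed by language understanding, to turn a spoken utterance into a structured intent. The endpoint URLs, key and response fields are placeholders for illustration only, not any specific vendor's API.

```python
# Illustrative sketch: chaining cognitive-style APIs to turn an utterance
# into an intent. The endpoints, headers and response fields below are
# placeholders, not any specific vendor's contract.
import requests

SPEECH_ENDPOINT = "https://example.com/speech/recognize"      # hypothetical
LANGUAGE_ENDPOINT = "https://example.com/language/interpret"  # hypothetical
API_KEY = "YOUR_KEY_HERE"

def utterance_to_intent(audio_bytes: bytes) -> dict:
    """Speech -> text -> intent, one hop per cognitive service."""
    headers = {"Authorization": f"Bearer {API_KEY}"}

    # 1. Speech API: convert raw audio into text.
    speech = requests.post(SPEECH_ENDPOINT, headers=headers, data=audio_bytes)
    speech.raise_for_status()
    text = speech.json()["text"]  # assumed response shape

    # 2. Language API: extract the user's intent and entities from the text.
    language = requests.post(LANGUAGE_ENDPOINT, headers=headers, json={"query": text})
    language.raise_for_status()
    return language.json()  # e.g. {"intent": "...", "entities": [...]}

if __name__ == "__main__":
    with open("utterance.wav", "rb") as f:
        print(utterance_to_intent(f.read()))
```

The point is less the specific calls than the composition: each cognitive service does one narrow job, and the experience layer stitches them together around the user's situation.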
How to Make the Experiences Happen
To translate 'screenless' experiences into reality, and to tap into situational centricity, enterprises need to experiment with new business models that help re-engineer outcomes. Those who adapt and change not only the experience but also their business models and workflows to exploit and create new revenue models will enjoy success.
In recent years we have witnessed the rise of two similar, yet distinct practices across the enterprise:
• Modern engineering methods, such as agile and DevOps, have completely rewired our approach to software development, product lifecycles, and the speed at which we bring new products, services and solutions to market.
• Design thinking is helping businesses adopt a deeply human-centric viewpoint, redefining why and what they should be building in the first place.
It's time for enterprises to start combining design thinking principles with their modern engineering teams and projects. Think of it as design-driven engineering, or what we call 'DesignOps'.
Organizations will need to build up a culture, mindset and business model ready for a DesignOps revolution, where everyone is focused on the user and on value. Any product, service or solution not created with the deeply human-centric, situation-centric insights of design thinking will not last long.
And as enterprises look to adopt this new DesignOps approach, they need to go back to the ambient AI concept (some call this 'light AI') and find the right places and scenarios where it can be used. This should be broken down into bite-size chunks, looking at the right set of automation areas to increase productivity. This approach goes to the heart of DesignOps: experimenting with APIs and fluid experiences that span intelligent automation, Robotic Process Automation (RPA) and physical automation can really make a difference.
With the democratization of AI, enterprises don't need to set up research labs or have deep pockets to tap into the AI market; they can build on top of the industry giants. Broadly speaking, we see a progression for enterprises as their use of APIs and algorithms matures: from consumer APIs, to developer APIs, to enterprise APIs.
For example, consumers are using Alexa on Amazon Echo devices at home; Alexa in turn uses Bing APIs for search and various cognitive APIs for speech and voice. As usage of Alexa grows, many developers are starting to build 'apps' (skills) for it, along the lines of the sketch below. In some cases these may not add much value (remember the early days of the Apple App Store?), but this activity is nevertheless piquing the interest of enterprises.
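To ground what such an Alexa 'app' involves, here is a minimal sketch of a skill handler written as an AWS Lambda function. The intent name and the reply text are invented for illustration; the JSON follows the general shape of an Alexa skill request and response.

```python
# Minimal sketch of an Alexa skill handler running as an AWS Lambda function.
# "GetAccountBalanceIntent" and the reply text are invented for illustration;
# the request/response structure follows the general Alexa skills JSON shape.
def lambda_handler(event, context):
    request = event.get("request", {})

    if (request.get("type") == "IntentRequest"
            and request.get("intent", {}).get("name") == "GetAccountBalanceIntent"):
        speech = "Your current balance is one hundred and twenty dollars."
    else:
        speech = "Welcome. You can ask me for your account balance."

    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

Thin wrappers like this add little on their own; the enterprise value comes when the handler reaches back into line-of-business systems such as ERP or CRM.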
In summary, enterprises that embrace AI as the new UI and experience layer, powered by situational centricity, and take a DesignOps approach will be ready for the AI world.