AI and future user experience
Artificial intelligence is nearing genuine utility.
Exhibit 1 – a chess programme learned to play at International Master standard in 4 days. It did this not through brute-force minimax (effective, but not really intelligent) but via neural networks and self-correction over time. (This was the exact topic of my Master’s dissertation – endless boring enthusiasm available on request.)
Exhibit 2 – OS giants are positioning predictive AI as central to their value propositions, viz. Google Now on Tap and iOS 9’s Proactive Siri. They’re immature, but they’re clearly aiming to become connective tissue, bending to context and learning from rich user data. On wearables these agents become even more central, since physical input is constrained and context is richer.
Exhibit 3 – I mentored at Seedcamp last week, and heard the phrase “machine learning” echoed in the majority of pitches. AI was the secret sauce, the differentiator. Now, these are early startups with a ton of thorny execution ahead of them – but it seems AI isn’t just for the big guns now.
AI is becoming a cornerstone of user experience. This is going to be interesting (read: difficult) for designers.
1. No longer will products be fully deterministic. We won’t be able to enumerate every page, every response. Instead we’ll have to create frameworks / scaffolds / templates for AIs to deliver output through. These scaffolds may be sonic, tactile, and linguistic as well as visual.
2. The front-end engineer will no longer be the dominant manufacturer of user experience. Designers have become competent at working with front-enders to ensure UI quality, but now we’ll have to understand and partner with data scientists and deep back-end engineers too. Some knowledge of statistics, and perhaps of AI itself, will probably be useful.
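To make the first point concrete, here’s a toy sketch (Python, every name invented for illustration) of what a designer-authored scaffold might look like: the designer owns the template and the graceful fallback; the model only fills the slot, however unpredictable its output.

```python
# Hypothetical sketch: a designer-authored scaffold for non-deterministic
# AI output. The model supplies an intent and a confidence score; the
# scaffold decides how (and whether) that guess reaches the user.

def render_suggestion(intent: str, confidence: float) -> str:
    """Fit a model's raw suggestion into a fixed, designer-owned template."""
    if confidence < 0.5:
        # Low confidence: degrade gracefully rather than show a wrong guess.
        return "Here if you need me."
    return "You might want to " + intent + ". Tap to continue."

print(render_suggestion("book a table", 0.9))
print(render_suggestion("recalibrate the flux", 0.2))
```

The point isn’t the code itself but the division of labour: the enumerable surface (templates, fallbacks, tone) stays with the designer, while the unenumerable content comes from the model.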
The role broadens once more.
[Inspired in part by recent conversations with Giles Colborne and Jai Mitchell.]