Call and Response UI/UX design (aka chatbot thoughts)
Chat has grabbed everyone’s attention, and it seems like the bots are here to stay.
I waver between thinking that chat is an interesting path toward the future of human/computer interaction design and thinking that chat is lazy UI/UX. The real magic of chat is the promise of much higher-level, less granular commands/input paired with the same level of detailed/granular output. Delivering that requires either systems that support humans responding live, in real time, at web scale, or AI.
The call and response nature of these interactions creates a broad new set of UI/UX design challenges that are unique to chat and dependent on the capacity of the back-end system to understand the call and generate the response. Before we get to full conversational AI, interaction designers will need to find ways to cheat: to constrain the problem set so that the user actually issues a command to the system but gets the perception of conversation and the magic of being understood, without the burden of remembering specific commands.
A bridge to AI might be command lines masquerading as AI-driven natural language interfaces, like telling Amazon to order more paper towels or texting Uber to “Take me home,” where the system has specific knowledge about you and your preferences and can run a script when you issue the command. NLP can likely map lots of command variants (“order more paper towels,” “I am out of paper towels,” etc.) to the appropriate/intended script. My challenge with this bridge is that it doesn’t cover enough use cases for any one system to become the primary interface: it is easier to push the button, get the car, and tap the destination suggestion of “home” than it is to remember that I can chat with Uber to “take me home” but not to take me from one meeting to another.
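To make that concrete, here is a minimal sketch of the “command line masquerading as conversation” idea. Everything in it is an illustrative assumption: the script names, the user profile, and the regex patterns stand in for a real intent classifier; none of this is an actual Amazon or Uber API.

```python
import re

# Hypothetical scripts keyed off stored user preferences; the names and
# fields here are assumptions for illustration only.
def reorder_paper_towels(user):
    print(f"Ordering {user['preferred_brand']} paper towels for {user['name']}")

def ride_home(user):
    print(f"Requesting a car to {user['home_address']} for {user['name']}")

# Each pattern maps several phrasings of the same command to one script.
# A production system would use an NLP intent model instead of regexes.
INTENTS = [
    (re.compile(r"\b(order|buy|get)\b.*paper towels|out of paper towels", re.I),
     reorder_paper_towels),
    (re.compile(r"\btake me home\b|\bgo home\b", re.I), ride_home),
]

def handle(utterance, user):
    for pattern, script in INTENTS:
        if pattern.search(utterance):
            return script(user)
    # Off the rails: fall back to listing what the system does understand.
    print("Sorry, I can help you reorder items or get a ride home.")

user = {"name": "Seth", "preferred_brand": "Bounty", "home_address": "123 Main St"}
handle("I am out of paper towels", user)    # maps to the reorder script
handle("take me home", user)                # maps to the ride script
handle("take me to my next meeting", user)  # uncovered use case: falls back
```

The third call is the weakness of the bridge in miniature: any utterance outside the enumerated intents drops the user out of the “conversation” entirely.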
The design of the interaction has to carry the user from one command the system understands to the next, pushing them further and further down known conversation trees without letting them feel the rails.
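One way to picture those rails is a conversation tree where each node understands only a few replies and, when the user steps off script, quietly re-prompts with the known options rather than surfacing an error. A minimal sketch, with hypothetical node names and prompts:

```python
# Sketch of a conversation tree that keeps the user on known rails.
# Node names, prompts, and replies are illustrative assumptions.
TREE = {
    "start":   {"prompt": "Where to?",
                "next": {"home": "confirm", "work": "confirm"}},
    "confirm": {"prompt": "Got it. Request the car now?",
                "next": {"yes": "done", "no": "start"}},
    "done":    {"prompt": "Your car is on the way.", "next": {}},
}

def step(node_key, reply):
    node = TREE[node_key]
    next_key = node["next"].get(reply.strip().lower())
    if next_key is None:
        # Off-script input: stay on the same node and nudge toward the
        # commands the system understands, without exposing an error state.
        options = " or ".join(node["next"]) or "nothing else"
        return node_key, f"{node['prompt']} (try {options})"
    return next_key, TREE[next_key]["prompt"]

node, prompt = "start", TREE["start"]["prompt"]
for reply in ["the airport", "home", "yes"]:  # one off-script, two on-script
    print(f"bot: {prompt}\nuser: {reply}")
    node, prompt = step(node, reply)
print(f"bot: {prompt}")
```

The design problem is exactly the re-prompt line: the nudge has to read as the bot understanding you, not as a menu in disguise.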
Most of the activity I have seen in the space takes the pressure off the designers (and the AI) by relying on humans to scale and handle all use cases from the start, hoping that the AI can catch up and fill in over time. I wonder if this will ever lead to true human/AI conversations or if, as with self-driving cars, human-assisted is fundamentally different from fully autonomous.