Chat interfaces can be traced back to as early as the 1960s, most notably ELIZA, the world's first chatbot, developed by Joseph Weizenbaum at MIT in 1966. Given the technological constraints of that era, it mimicked today's chat interfaces surprisingly well: you typed a prompt, and it printed a block of text in return.
Fast forward to the 2020s, and platforms like Meta's BlenderBot continued that paradigm. You typed a prompt; it returned a block of text. Because most of these platforms were intended as research experiments, little thought went into how people would interact with this new medium.
ChatGPT, launched in 2022, was released in a rush when OpenAI's executives decided it needed to be out in the open to avoid being upstaged by potential competitors like Anthropic. Again, little thought went into how people would interact with this new medium.
Today, most, if not all, products you use have chat interfaces: some disguised as FABs, some as 'summaries', and others as flat-out input fields. The problem, when chat is presented as the primary way of interacting with a product, is that it takes away intent. You're handed a blank slate and asked to tell the software that's meant to help you how to help you.
I see AI as a powerful way to connect a user's mental model to your software. The question becomes: how do you leverage AI's deep user context to help people achieve their goals within your product or service, better, faster, and with greater intent?


