Over the last five years, and almost certainly over the next five, tremendous progress in computing and AI research has made intelligence accessible and scalable. This progress is transforming how products, and their value, are best presented to users. Instead of building products that attract and occupy users' work or play, the best model-driven products will focus on getting as close as possible to where they can add the most value and removing any layers in between.
"I know it when I can't see it."
The software that has eaten much of modern business, and that we use to communicate, has naturally been built around humans and the information they create or that is created about them. Whether you are building payroll software or an ephemeral image-sharing app, common wisdom has been to craft and curate the experience your users have within your product when they access that information. Central to this has been a focus on excellent design that enables users to do what they want so easily that they keep coming back or paying. This experience has, in most cases, revolved around a database filled with your information and that of other humans, and an architecture to use that information to some end. Presenting your product as the visible caretaker and curator of that human-to-human information was critical; therefore, building your own environment with a unique and memorable design was an effective strategy.
Now, products that embrace a trifecta of human, AI model, and the problem or task immediately at hand are emerging. The product is no longer a guardian and distributor of information but an input to solving the problem. In some ways, consumer products built natively on large models might look more like developer tools than the breakout consumer products of the last twenty years. Successful products can fade into the background. The human notices them only where they are NOT present, which is the mark of a strong customer relationship.
Apple vs. Apple
A great way to explore the dichotomy of visibility is to look at two products almost anyone reading this has used. On each iPhone, users have the native Notes app, where they can jot down information and edit free-form text. Equally present, if invisible, on each iPhone is the keyboard autocorrect engine.
The first, Notes, is elegantly designed and an extremely reliable tool for creating, saving, and accessing notes on the go. It strives to provide an experience that welcomes you back and lets you accomplish more in your day. The keyboard autocorrect, on the other hand, is invisible. You almost certainly don't notice how often it tweaks and corrects what you are typing, especially if you have fat fingers, as I do. This tool is, in many ways, far more essential to the user experience of every app, even if invisible. If you were to disable it, you would know exactly how powerful it is.
Model-driven and model-native products will likely look much more like the latter. Their design lies in knowing when not to appear, when to interject, and how to accelerate whatever the task is.
A new product design
From the beginning, one of the most important visions for PromptLoop has been to bring the fruits of this progress as close as possible to the problems it can thrive in solving. The simplicity of the end product, a single formula embedded in an existing product, emerges from the belief that minimizing the distance and friction of using this technology is as important as the product itself. Being able to deliver powerful model results in a lightweight package, like Apple's autocorrect engine or Cortana for Master Chief in the Halo video games, is a decisive advantage.
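To make that shape concrete, here is a minimal sketch in TypeScript of what a "product as a single formula" can look like: one spreadsheet-style function that hands a cell's text and a task to a hosted model. This is a hypothetical illustration, not PromptLoop's actual implementation; the endpoint URL, the AIFORMULA name, and the request shape are all assumptions.

```typescript
// A minimal sketch of a "product as a single formula" (hypothetical; not
// PromptLoop's actual API). The entire surface area is one function that an
// existing tool, such as a spreadsheet, can call like any other formula.

// Hypothetical hosted-model endpoint; swap in whichever completion API you use.
async function callModel(prompt: string): Promise<string> {
  const res = await fetch("https://api.example.com/v1/complete", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  const { text } = (await res.json()) as { text: string };
  return text;
}

// The whole "product": a formula-shaped function. The spreadsheet or app the
// user already lives in supplies the interface; there is nothing new to learn.
export async function AIFORMULA(input: string, task: string): Promise<string> {
  return callModel(`${task}\n\nInput: ${input}`);
}

// Usage, as it might sit behind a cell like =AIFORMULA(A1, "Classify this
// company's industry"): the model's answer simply appears where the user
// already works, with no separate destination to visit.
```

The design choice this sketch illustrates is that the host product, not a new app, owns the experience; the model shows up only at the exact moment it can help.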