Sramana Mitra: From your product roadmap point of view, you operate in the post-content production scenario mostly, right? You’re looking at content and governing content. You’re not involved in the production piece.
Volker Smid: We are also involved in the content production piece because we provide the side bar for the authoring environment. When an author creates a piece of content, we have a side bar that is connected to the enterprise so that you can create content following the rules of the enterprise. We are connected to the production piece as well.
Sramana Mitra: It’s not difficult, from a product point of view, for you to insert a language model that is trained on high-quality content that you have scored and governed.
Volker Smid: Correct.
Sramana Mitra: I take it that’s what you’ve done to your product.
Volker Smid: That’s exactly what we’ve done. Because of our DNA as a spin-off of the German Research Center for Artificial Intelligence, we have a tremendous number of highly capable linguists in the company. Even before ChatGPT existed, we had people who could interact with large language models because that was our bread and butter. Using that capability and integrating the capabilities of OpenAI or other models into our platform is straightforward. Without those skills, it’s a long shot.
Sramana Mitra: What platform are you working with on the generative AI side?
Volker Smid: OpenAI. Within OpenAI, we experiment with two very different models. One is Curie and the other one is Davinci. This might change over time. We will probably see a large number of open-source models that might be of interest. We will see a large number of specialized models that might be of interest to our customers.
Sramana Mitra: Do your customers have a say in what language model you incorporate into the generation part?
Volker Smid: That’s an unknown, to be honest. We don’t know yet. Therefore, we have chosen two different models. One is considered to be high quality because the number of parameters in the model is 178 billion. Curie has only 8 billion. The interesting part is, if you train a model like Curie with high-quality content, the output might be as good as that of the large models. We try to engage with customers to decide which model they want to use. For now, we are making the selection and educating ourselves.
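The idea of tuning a smaller model like Curie on governed, high-quality content can be sketched as a data-preparation step: keep only the pieces that pass the quality score and emit them as prompt/completion pairs in the JSONL shape that fine-tuning endpoints typically expect. This is a minimal illustration, not Acrolinx’s actual pipeline; the record fields, score threshold, and prompt template are all assumptions.

```python
import json

# Hypothetical content records; in practice these would come from the
# governance platform's scoring backend. Field names are illustrative.
scored_content = [
    {"title": "Release notes", "body": "Clear, on-brand copy.", "score": 92},
    {"title": "Draft memo", "body": "Rambling, off-style text.", "score": 41},
    {"title": "Help article", "body": "Concise, rule-compliant text.", "score": 88},
]

SCORE_THRESHOLD = 80  # assumed cut-off for "high-quality" content


def to_training_examples(records, threshold=SCORE_THRESHOLD):
    """Keep only high-scoring pieces and emit prompt/completion pairs
    as typically used for fine-tuning a smaller language model."""
    examples = []
    for rec in records:
        if rec["score"] >= threshold:
            examples.append({
                "prompt": f"Write on-brand content titled: {rec['title']}\n\n",
                "completion": " " + rec["body"],
            })
    return examples


examples = to_training_examples(scored_content)
# One JSON object per line, ready to upload as a fine-tuning file.
jsonl = "\n".join(json.dumps(e) for e in examples)
print(f"{len(examples)} of {len(scored_content)} pieces kept")
```

The point of the filter is the one Smid makes: the quality of the tuning data, not the size of the model, is what carries the result.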
Sramana Mitra: How many of your 200 Global 2000 customers have started incorporating generative AI into their process?
Volker Smid: We are at a stage where we have a prototype. We have lined up about 22 customers to participate in a closed beta. Based on the results, we will decide whether to open it up to a more public beta.
Sramana Mitra: What kind of compute requirement does adding this product need for these customers who will be adopting?
Volker Smid: There are no additional compute requirements. Tuning an existing model with existing content does not overload our system at all. Using the result is also fairly straightforward. The compute requirements are not vastly different from what we have today. Today, we have a significant compute environment, because if you want to constantly check and score seven million content pieces, you need a backend that is able to do this.
Sramana Mitra: What compute infrastructure are you using?
Volker Smid: The preference is Amazon AWS. We have deployments on Azure and on Google Cloud.
Sramana Mitra: You are willing to deploy on any of those basically.
Volker Smid: Yes. It’s interesting to see that, even in the world of cloud, there is still a preference to run our software on-premises. I should say that about 20% to 30% of accounts still prefer to run this on-premises.
Sramana Mitra: How do you do that with AWS?
Volker Smid: The software runs on Kubernetes, so it can also be deployed in a customer’s local Kubernetes environment.
Sramana Mitra: These companies have their own local infrastructure.
Volker Smid: Yes. The more advanced the cloud solutions, the more complex the Kubernetes environment becomes. At some point, it’s just too much.
This segment is part 5 in the series : Extending Product Roadmaps with Generative AI: Acrolinx CEO Volker Smid