How We Improved Kafkai and our Customer Churn
When we started in April 2022, we had a few customers who tried Kafkai but then stopped using it and left. It was (and still is) important for us to address this churn. It took some time to gather the data, pinpoint the issue, and finally design and roll out a fix.
Today we're seeing the improvements that we expected. So I'm writing this blog post in part to remind ourselves of what we did and why, but also for our customers, who expect nothing less from us.
When a customer unsubscribes from Kafkai, we show a short form asking why they are leaving. They can choose one answer: "Quality", "Pricing", "Length" or "Other".
Quality of generated content was an issue
By the middle of last year, this feedback showed us two things we needed to address: "Quality" accounted for more than 42% of all responses, while "Pricing" was a close second at 35%.
We set out to tackle the "Quality" issue first, by training our models with more and better data and by adding more functionality to the generation pipeline - invisible to our customers - to improve the output of our next generation of models. This took time, but we managed to roll out the fixes in mid-December 2022.
Next is pricing
Since then, we've looked back at the responses customers give when they decide to leave, and we're pleased to report that "Pricing" now accounts for 50% of all responses while "Quality" is down to only 19%. Customers are now mostly leaving because of the price, not the quality of the articles.
We're excited and relieved that our efforts to improve our customer experience have shown tangible results, and the way forward is clear to us.
To be frank, fixing the pricing is the easier task. Based on our conversations with customers, we think we understand why they would want to use Kafkai instead of other AI text content generators. So all we have to do is set a fair price that delivers this particular value to the customers who need it most.
Which we are already working on!
So stay tuned for more updates in this space.