357: (Free) Trial & Error

Arvid:

Hey. I'm Arvid, and this is The Bootstrapped Founder. Today, I'll dive into the difference between a trial user and a trial abuser and what you can do to invite the former and prevent the latter. This episode is sponsored by paddle.com, the payment provider I use in all of my SaaS properties. If you wanna build your business without even having to think about registering for sales tax in some country where you have, like, 2 customers, just use Paddle.

Arvid:

They're a merchant of record, and they will take care of all of that for you. And their API also allows you to collect payment information before your users start their free trial, which is the topic of today. And as we learn year after year from the State of Independent SaaS survey that Rob Walling and the MicroConf team analyze, that's a pretty solid strategy for subscription businesses. So check out Paddle and focus on what matters.

Arvid:

And here's what's been on my plate this week. As new users join PodScan and start their 10-day trial, I've been working on determining the right limitations to set for them. And I know this challenge is not unique to my business. It's something that every software company that offers a free trial struggles with at some point. Some of them struggle with it at all points, but let's just talk about this initial stage.

Arvid:

The goal is to strike a perfect balance here. You wanna show enough value to convert users while still protecting your business from excessive usage or outright abuse. And this balance has become particularly crucial in the age of AI-powered SaaS. It used to be that for a SaaS business, you would just have your database and compute, and that's what cost you money. But now there's this kind of outside cost that you have very little control over and that can quickly explode.

Arvid:

AI features always come with associated costs, and those can scale significantly if users find ways to abuse them, or even if they just use them too much in legitimate ways. Take Pieter Levels' recent experience with Photo AI. He shares this stuff all the time, and his AI-powered photo generation service has a credit-based payment system, and he's been fighting an ongoing battle with creative exploiters. That's what I would call these kinds of people. Not criminals, not cheaters, but creative exploiters.

Arvid:

And those people found a particularly clever loophole in what I think Stripe had implemented at that point (they have since changed it at Pieter's request): they would upgrade their plan, use all of the new credits they would get, and then downgrade the plan again to get a prorated refund on the money they paid, while keeping the already-used credits. And then rinse and repeat. This kind of exploitation can quickly drain resources and cost you a lot of money, and that threatens the service's sustainability. Because GPU compute is very, very expensive, and you don't get your money back just because you didn't think of this potential exploit vector for your business. So that can very quickly drain your bank account, and you don't want that.

Arvid:

And even benign and legitimate use can be problematic. I had to very quickly build a limitation system this week because several users signed up after I went on Greg Isenberg's podcast to talk about AI business ideas. And it looks like there are a lot of crafty founders listening to that show, and they signed up for PodScan, which I love. And a couple of them set up very general alerts for extremely common search terms. Like, think of words like, I don't know, videos or books or comments.

Arvid:

Right? Things that probably are mentioned on a lot of podcasts, and they are quite literally mentioned on a lot of podcasts, because I see them in my feed. And those users turned on the context-aware filtering feature. That's something I offer on all plans, including trial plans, so people can see what it does, and it incurs an AI processing step for every matched mention. The idea is that somebody's keyword gets mentioned, and then I check if some other condition is also true by asking an AI a question about that mention. And this is still manageable even for trials, but I did the math on this, and it would easily cost me hundreds of dollars just to run that step for all of the results a trial user might have, and that kind of felt over the top.
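
To make that two-stage idea concrete, here's a minimal sketch in Python. The names are hypothetical (the episode doesn't show PodScan's actual code): a cheap keyword match runs first, and an `ask_ai` callback, standing in for whatever model call you use, runs only on the matches. That second step is the expensive one worth limiting on trials.

```python
# Two-stage matching: cheap substring filter first, expensive AI check second.
# `ask_ai(question, transcript)` is a placeholder for a real model call; here
# it just needs to return a truthy value when the condition holds.
def context_aware_matches(transcripts, keyword, question, ask_ai):
    # Stage 1: cheap, runs over everything.
    hits = [t for t in transcripts if keyword.lower() in t.lower()]
    # Stage 2: costly, runs only over the keyword hits.
    return [t for t in hits if ask_ai(question, t)]
```

The point of the structure is that the number of `ask_ai` calls, not the number of transcripts, is what drives your AI bill, so that's the number to cap.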

Arvid:

Right? There's something in the concept of a trial that should not incur hundreds of dollars. So I capped trial AI processing at a reasonable hourly number, and when that ceiling is hit, a UI feature informs the trial user that all they need to do to get the full experience is to actually sign up and pay. And when you offer something for free, you need to very carefully consider the balance between what you're willing to give away for free and what users need to see to evaluate your product properly. Because in essence, a free trial is a form of customer acquisition cost, the fabled CAC that everybody talks about.
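
An hourly ceiling like that can be sketched in a few lines of Python. This is an in-memory illustration, not PodScan's implementation, and the limit number is made up; in production you'd back the counter with your database or a cache like Redis.

```python
import time
from collections import defaultdict

# Hypothetical ceiling; the right number depends on your per-call AI cost.
TRIAL_HOURLY_AI_LIMIT = 50

# (account_id, hour_bucket) -> AI calls used in that hour
_usage = defaultdict(int)

def try_consume_ai_call(account_id, now=None):
    """Return True if this trial account may run one more AI step this hour."""
    bucket = int((now if now is not None else time.time()) // 3600)
    key = (account_id, bucket)
    if _usage[key] >= TRIAL_HOURLY_AI_LIMIT:
        return False  # caller shows the "sign up to get the full experience" UI
    _usage[key] += 1
    return True
```

Bucketing by hour means the counter resets automatically at the top of each hour, which matches the "reasonable hourly number" approach described above.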

Arvid:

It's not like an ad where the cost is paid the moment somebody clicks on a link, lands on your product's website, and takes it from there. That's a one-time advertising expense. But with a trial, your CAC continues to accumulate throughout the trial period until the user either converts and starts paying, or they leave. Right? And that's, I guess, a very high cost, because you spent all of this money on a user that did not even buy the product.

Arvid:

And understanding your trial as this ongoing cost makes it more manageable. I think it makes it more projectable too. You can do the math now. If you know your average customer pays $20 a month with a typical lifetime value of $200, it would not make sense to let them consume $500 worth of AI credits during that trial or even $50. That's a lot of money still.
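
You can turn that back-of-the-envelope math into a tiny helper. The $20/month and $200 LTV come from the paragraph above; the 20% trial conversion rate and the "spend at most 10% of LTV on acquisition" fraction are assumptions I'm adding for illustration, so tune both for your own funnel and margins.

```python
def max_trial_spend(monthly_price, avg_lifetime_months,
                    trial_conversion_rate, cac_fraction=0.1):
    """Rough ceiling on what a single trial may cost you.

    cac_fraction: share of LTV you're willing to spend acquiring a customer.
    Since only a fraction of trials convert, the per-customer budget gets
    spread across all trials started.
    """
    ltv = monthly_price * avg_lifetime_months
    return ltv * cac_fraction * trial_conversion_rate

# $20/month, 10-month lifetime ($200 LTV), assumed 20% conversion, 10% of LTV:
budget = max_trial_spend(20, 10, 0.20)  # roughly $4 of cost per trial
```

Seen this way, letting one trial user burn $50 (let alone $500) of AI credits is off by an order of magnitude or two.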

Arvid:

You need to carefully consider your cost structure and then set limits accordingly for these things. And different services face different constraints, obviously. Every business is unique in the exact makeup of its at-risk expenses. If you run a video platform, you need to think maybe more about storage and bandwidth, and maybe also the transcoding time that you have. If you run a number-crunching-heavy app, you have to think about compute and maybe the memory that compute might require. And for AI-based software, you really have to look at the API usage for all those external GPU-based services that you use.

Arvid:

And that's just one side of it. There's the other side: the size of the trial opportunity. You kind of have to factor that in too, because some customers might just try it out, see what it is, maybe buy it, and they're always gonna be on the lowest tier. That's great.

Arvid:

And you have to set limits for those, but others might be way more promising, and their lifetime value, their LTV, is significantly higher. So their acquisition cost, the money you spend on getting them to see the whole thing to figure out if it's for them, might also need to be adjusted. And in my case with PodScan, I have to consider scenarios like large news agencies, right, TV broadcasting stations wanting to trial our platform and then analyze all political podcasts out there for new developments or for interesting angles or avenues of conversation. While this could create significant server load if I were to allow it, it might be worth accommodating this potentially more enterprisey customer.

Arvid:

Which is why building in flexibility is crucial. You might even wanna think about pre-scoring your customers for how likely they are to be one of those larger opportunities, and have multiple trial limit groups inside your trial system to allow different limits for different kinds of people, all of whom are still on a trial. The key is identifying which actions in your business incur cost when repeated, and then, after figuring this out, you need to protect yourself from overuse while still making the product usable and not worthless to users.
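
Those trial limit groups might look something like this sketch. The group names, score thresholds, and limit numbers are all made up for illustration; the idea is just that a pre-computed opportunity score on the account selects a bundle of limits instead of one hardcoded set.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TrialLimits:
    ai_calls_per_hour: int
    alerts_max: int

# Hypothetical tiers; every number here is illustrative.
TRIAL_GROUPS = {
    "default":    TrialLimits(ai_calls_per_hour=50,   alerts_max=5),
    "promising":  TrialLimits(ai_calls_per_hour=200,  alerts_max=20),
    "enterprise": TrialLimits(ai_calls_per_hour=1000, alerts_max=100),
}

def limits_for(account):
    """Pick a limit group from a pre-computed opportunity score (0-100)."""
    score = account.get("opportunity_score", 0)
    if score >= 80:
        return TRIAL_GROUPS["enterprise"]
    if score >= 50:
        return TRIAL_GROUPS["promising"]
    return TRIAL_GROUPS["default"]
```

Keeping the groups in one table means you can add a tier, or move an account between tiers, without touching the code paths that enforce the limits.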

Arvid:

So your limits still need to be reasonable so that people can see what your product is. A good example of how people usually do this is rate limits, and that's usually API-based. Like, if you go to the YouTube API, Google's API to check for videos and stuff, they have 10,000 requests per day. That's their limit.

Arvid:

And that allows you to do a lot of stuff, although they have a quota system. So it's a quota of 10,000, like, quota items a day, and a search is 100 of those quota items. So you can do 100 searches a day, which is not much, but it allows you to build a prototype, something that slowly gathers data for free without having to go into the more expensive tiers. Or take photo uploading services (you probably use these all the time, just uploading stuff to Instagram or whatever): you have a limit to how many images you can upload simultaneously, and per hour too.
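
That YouTube-style quota model, a daily budget of abstract units where each operation has a different unit cost, is easy to sketch. The 10,000-unit daily budget and the 100-unit search cost match the numbers mentioned above; the 1-unit cost for fetching video details is an assumption for illustration.

```python
# A daily budget of abstract quota units, with per-operation costs.
DAILY_QUOTA = 10_000
OPERATION_COST = {"search": 100, "video_details": 1}

class QuotaTracker:
    def __init__(self, budget=DAILY_QUOTA):
        self.remaining = budget  # reset this once a day in a real system

    def charge(self, operation):
        """Deduct the operation's cost; return False if the budget is spent."""
        cost = OPERATION_COST[operation]
        if cost > self.remaining:
            return False  # quota exhausted for today
        self.remaining -= cost
        return True
```

The nice property of unit-based quotas over plain request counts is that one knob covers cheap and expensive operations at once: a trial user can make many cheap calls or a few expensive ones, but the total cost to you stays bounded.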

Arvid:

They kind of throttle this stuff at some point. So that's usually a way to do it. For PodScan, I found several things that I always look into when it comes to setting up these limits, and I'm just gonna share these implementation tips with you now. The first thing is to always set clear baseline limits that protect your business. Just think of it as the maximum amount of usage that a normal user would and should incur on these expensive features.

Arvid:

Right? In my case, it would be: every single day, how often do I want them to hit my internal GPU-accelerated AI? Is it 100 times? Is that already too much?

Arvid:

Is 10 times enough? Is seeing 10 examples of what the thing can do sufficient, or does it maybe need to be 1,000 times? Alright, that's something you have to figure out: what is the baseline? And then when you have that baseline, you need to implement the limit for it.

Arvid:

But on top of this, you need those limits to be configurable at the account level, because this flexibility becomes crucial when working with enterprise prospects, or even just a standout customer that you really want to show what your platform can do. And you need to be able to change those limits either programmatically or by just going into the database and changing a number; you wanna be able to change them without having to rewrite the code. So making them configurable as some kind of config object, a config number on the account or the team or whatever you have in your org structure, is very important from the start. And then when you have that setup, whenever a limit-related event happens, like a limit being reached or somebody being about to reach a limit within the next hour or so, you need to log this kind of stuff meticulously.
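
One simple way to get that configurability, sketched here with hypothetical field names, is plan defaults plus per-account overrides. Raising a limit for one enterprise prospect then means changing a stored number on their account, not shipping code.

```python
# Per-plan defaults; the numbers are illustrative.
PLAN_DEFAULTS = {
    "trial": {"ai_calls_per_hour": 50, "alerts_max": 5},
    "paid":  {"ai_calls_per_hour": 500, "alerts_max": 50},
}

def effective_limit(account, key):
    """Resolve a limit: an explicit per-account override (e.g. a number you
    set in the database for one standout prospect) wins over the plan default."""
    overrides = account.get("limit_overrides") or {}
    if key in overrides:
        return overrides[key]
    return PLAN_DEFAULTS[account["plan"]][key]
```

Every place in the code that enforces a limit reads it through `effective_limit`, so there's exactly one resolution rule, and nothing is hardcoded at the call sites.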

Arvid:

Because hitting a limit, or getting close to one, is a non-happy-path event, both for you and your software, because you set the limit for a reason, and for the user, because they might not be able to do what they wanted to do once they hit the limit. So this informs you about a usage pattern that you might not have foreseen or do not want. And then finally, once you track how trial users interact with these limitations, you can act on it. This logging and tracking provides invaluable insights into user behavior. And I've seen trial users at PodScan set up the most unexpected integrations and alert settings that accidentally spam our servers or create massive database loads, simply because they are just trying to figure things out and haven't succeeded yet.
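
Meticulous logging of those limit events can be as simple as emitting one structured record per event. This is a sketch with assumed field names; in practice you'd route these records to whatever alerting or analytics pipeline you already have.

```python
import json
import logging
import time

log = logging.getLogger("limits")

def record_limit_event(account_id, limit_key, used, ceiling):
    """Emit a structured event when usage nears or hits a ceiling."""
    event = {
        "ts": time.time(),
        "account_id": account_id,
        "limit": limit_key,
        "used": used,
        "ceiling": ceiling,
        # "hit" means blocked; "approaching" is the early-warning case.
        "kind": "hit" if used >= ceiling else "approaching",
    }
    log.warning(json.dumps(event))
    return event
```

Distinguishing "approaching" from "hit" is what makes the proactive outreach described next possible: you can message a user about a misconfigured alert before they ever run into a wall.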

Arvid:

These moments, when people are just figuring stuff out and they're kind of confused and didn't really get it, create perfect opportunities for meaningful customer interactions. For instance, when I notice that a trial user is repeatedly hitting certain limits, I reach out to them with a message like: hey, I see you've set up an alert for this very generic keyword, and it's causing a notification every couple of seconds, which is probably not what you want and not the result you might be looking for. But here's a more effective approach: how about these 3 specific keywords that I thought about for you and this use case?

Arvid:

And this not only helps the user succeed, it's kind of customer success in that moment, but it also gives me crucial insight into the job they're trying to accomplish, because I can ask them: why did you set up this very common keyword? And they'd usually tell me: well, I want to find all podcasts that talk about this. Then I'm like: yeah, but if you pick a more specific keyword and then let the context-aware filter look into the broader scheme of things, you might get better, higher-quality data. And then usually they're like: oh, okay. And then we set it up, and it's a much more reliable thing that costs me way less. These user interactions have led to numerous improvements, both in our onboarding documentation and in the clarity of the UI, how I communicate the steps, the example workflows that I give to people, all of this.

Arvid:

Sometimes, they even reveal potential enterprise customers among the trial users who need custom limits for their use case. I talk to people and they're like: okay, yeah, I'm trialing this for our 4,000-person company, and I wanna see what this can do. Okay, let's see what we can do for you. And at other times, this helps me identify users who might not align with our business model. When people wanna do something that PodScan is just not built for, I'm saying: well, we could do it, but it's really not what we wanna do here.

Arvid:

So here's how you would do it, but, okay, maybe this is not for you. Don't feel pressured to implement every feature or do everything that people want from you if they're trial users, because trial users still have not paid, right? The commitment is not there. Recently, I had a trial user demand quite significant processing capability on their account to test something out, without making any commitment even to our essential plan after I asked them to. They wanted to see what the system can do, but not pay for it.

Arvid:

And I guess that's just the expectation for anything you're trying to sell, but it might feel counterintuitive here, because it's leaving potential future money on the table. If you could convince them, they would probably pay. But saying no to such requests is often the right move for your business's long-term health. Because if you don't, you allow people you don't really want to select into your customer group. What you want is people self-selecting out of your customer group if they're not a fit and self-selecting into your customer group if they are. You don't want to enable people who should be out of this group to come into it.

Arvid:

That's really the idea. And remember that the purpose of a free trial is not to provide unlimited free access to your product. That's a misconception that I often had myself in all these software products. I wanted to give this for free so that people could use it and love it and then maybe pay for it. That's not the idea.

Arvid:

A trial's purpose is to demonstrate value quickly, right, upfront, and then allow users to make an informed decision about committing to your product. And that's the point: to give them enough to make a decision to then pay, and not more than that. So while trials should showcase your software's full capabilities when it's possible and feasible, some features might not be demonstrable through a trial. You might actually need to do this through case studies or demos or videos instead, when it comes to, like, expensive customizations or integrations with services that you pay for.

Arvid:

That stuff shouldn't be in your trial. Even though you wanna show as much as you can, sometimes you can only show it; you can't really give it to people for free. So set clear limits, make them configurable, and carefully monitor usage patterns, and you will end up with a trial system that effectively demonstrates your product's value without risking your business's financial health. The goal is really to front-load value as much as possible while still maintaining sustainable operations in the back end of your business.

Arvid:

And sometimes that just means saying no to users who don't align with your business model. Or at the very least, saying: not too much. Don't use it too much. Use it. Check it out.

Arvid:

But don't overstay your welcome here, because in the end, these are still prospects and not paying users. And that's it for today. Thank you so much for listening to The Bootstrapped Founder. You can find me on Twitter @arvidkahl, A R V I D K A H L, and you will find my books and my Twitter course there too. If you wanna support me and this show, please tell everybody you know about podscan.fm if you think they might wanna use it, and leave a rating and a review by going to ratethispodcast.com/founder.

Arvid:

Makes a massive difference if you show up there because then the podcast will show up in other people's feeds. Any of this will help the show. So thank you so much for listening. Have a wonderful day, and bye bye.

Creators and Guests

Arvid Kahl
Host