First, the right way: make sure you have data on prospect behaviors or demographic traits, as well as conversion data. Run analyses to determine whether prospects with a behavior/action or demographic trait converted at a higher rate than those who didn’t. Easy. (Not really, but not as scary as you might think.)
Now, the wrong way. Maybe it’s the way your boss told you to do it, or maybe it’s something you picked up from Marketo’s or Eloqua’s forums:
Step 1: Identify the actions that your prospects can take (e.g. visit your website or particular sections of your site, open an email, or attend a webinar)
Step 2: Assign a number of points to each action. Probably mostly arbitrary numbers.
Step 3: Give prospects some arbitrarily defined points each time they perform one of your arbitrarily chosen actions.
Step 4: Make further arbitrary distinctions by declaring a lead good and ready for sales when it has earned an arbitrary number of points.
Step 5: Pat yourself on the back and go to happy hour.
Sadly, this is how most places seem to do lead scoring, and it’s a terrible way to do things. Full disclosure: while I pride myself on generally not doing terrible things, I’ve fallen into the trap of doing lead scoring this way in the past, so don’t beat yourself up over it. Even if you did things the wrong way, there’s probably still a way to make use of all that work in Steps 1 through 4. And when you do it right, you can still pat yourself on the back and go to happy hour (Step 5). All you have to do is make sure you have all this prospect data stored somewhere, and that you can link those prospects to conversions (however your org defines them).
Data You Will Need To Do This The Right Way
- Prospect actions and dates (preferably with times) that they took each action
- Date (preferably with time) that the prospect was sent over to Sales
- Date that the prospect converted (if it did at all)
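Assuming those three pieces of data live in simple tables, here is a minimal sketch of how you might join them into one row per prospect. All field names and example records are hypothetical, not a required schema:

```python
from datetime import datetime

# Hypothetical example data; field names are assumptions, not a required schema.
actions = [  # one row per prospect action, with a timestamp
    {"prospect_id": 1, "action": "webinar", "at": datetime(2023, 1, 5, 10, 0)},
    {"prospect_id": 1, "action": "visit",   "at": datetime(2023, 1, 6, 9, 30)},
    {"prospect_id": 2, "action": "visit",   "at": datetime(2023, 2, 1, 14, 0)},
]
handoffs = {1: datetime(2023, 1, 10), 2: datetime(2023, 2, 3)}  # date sent to Sales
conversions = {1: datetime(2023, 3, 1)}  # prospect 2 never converted

def prospect_features(actions, handoffs, conversions):
    """One row per prospect: pre-handoff action counts plus a converted flag."""
    rows = {}
    for a in actions:
        pid = a["prospect_id"]
        # Only count behavior that happened *before* the lead went to Sales,
        # so the features reflect what you knew at scoring time.
        if a["at"] >= handoffs[pid]:
            continue
        row = rows.setdefault(pid, {"prospect_id": pid,
                                    "converted": pid in conversions})
        row[a["action"]] = row.get(a["action"], 0) + 1
    return list(rows.values())
```

The pre-handoff filter matters: counting actions a prospect took after Sales already had the lead would leak the outcome into your analysis.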
Once you have that database of prospect actions and conversions ready to go, run some simple analyses. Did prospects who attended webinars prior to getting sent over to Sales convert at a higher rate than those who didn’t? Did prospects who visited the website more frequently convert at a higher rate? Did prospects who got to particular pages convert at a higher rate than those who didn’t visit those pages?
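The webinar question, for instance, comes down to comparing two conversion rates, and a two-proportion z-test tells you whether the difference is more than noise. A rough sketch, with counts made up purely for illustration:

```python
from math import sqrt

def compare_conversion_rates(converted_a, total_a, converted_b, total_b):
    """Two-proportion z-test: returns each segment's rate and the z statistic."""
    p_a, p_b = converted_a / total_a, converted_b / total_b
    # Pooled rate under the null hypothesis that the segments convert equally.
    p_pool = (converted_a + converted_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    return p_a, p_b, z

# Hypothetical counts: 40 of 200 webinar attendees converted,
# vs. 90 of 900 prospects who never attended one.
rate_webinar, rate_no_webinar, z = compare_conversion_rates(40, 200, 90, 900)
# |z| > 1.96 is the usual threshold for significance at the 95% level.
```

Repeat the same comparison for each behavior or trait you collected, and you have the raw material for evidence-based scoring instead of arbitrary points.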
Some may think, “Well, OBVIOUSLY those segments converted at a higher rate.” However, that’s not necessarily the case. In my own experience at PitchBook, some of the signals that marketers might think of as signs of great leads actually weren’t. Some examples:
1. More Visits / Pageviews → Higher Interest? → Higher conversion?
FALSE! It turned out that some people were just visiting the site frequently because they loved the content from PitchBook’s outstanding free newsletter. They liked reading our news but had no need of our product.
And over the years, these prospects just kept visiting more and more and reading our free articles, racking up more and more points, but honestly never getting any more interested in our product.
What we ended up finding was that prospects who had fewer visits (but not zero) actually converted at a higher rate than those who had more visits.
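A quick way to surface that kind of non-monotonic pattern is to bucket prospects by visit count and compute the conversion rate per bucket. A rough sketch; the bucket edges and example data are my own assumptions, not PitchBook’s actual numbers:

```python
from collections import defaultdict

def rate_by_visit_bucket(prospects):
    """Conversion rate per visit-count bucket (edges chosen for illustration)."""
    def bucket(visits):
        if visits == 0:
            return "0 visits"
        return "1-5 visits" if visits <= 5 else "6+ visits"

    counts = defaultdict(lambda: [0, 0])  # bucket -> [converted, total]
    for visits, converted in prospects:
        c = counts[bucket(visits)]
        c[0] += int(converted)
        c[1] += 1
    return {b: conv / total for b, (conv, total) in counts.items()}

# Hypothetical (visits, converted) pairs echoing the finding above:
# moderate visitors convert best, heavy visitors are often newsletter readers.
prospects = [(0, False), (0, False), (3, True), (4, True),
             (2, False), (12, False), (20, False), (8, True)]
rates = rate_by_visit_bucket(prospects)
```

If the middle bucket outperforms the top one, a linear “more points per visit” rule is actively working against you.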
2. Visiting Key Pages/Product Information Pages
FALSE! Sort of. At one point we found that leads who visited our “product” pages, rather than our News pages with free articles about the VC and Private Equity industry, did convert at a higher rate than those who had only been to News pages. So that follows the intuitive line of thinking. However, over time, it changed: some of the Product page visitors converted at a worse rate.
“Some” visitors, because the prospects who filled out a lead form asking to get contacted, of course, continued to convert at a high rate. It was the remaining prospects, the ones who hadn’t filled out a form but had visited the key pages, who were now converting at a lower rate.
Why? It’s hard to say for sure, but our guess is that over time we became better at getting visitors to fill out forms. As our conversion rate optimization efforts improved, anyone who was even a little bit interested in our product became more likely to fill out a form. (Note: if someone filled out a form, they got sent to Sales; we were left figuring out what to do with the ones who hadn’t.) The remaining visitors we could send to Sales weren’t interested enough to fill out a form even though we’d made it easier for them, so it followed that they made for bad leads when we tried to send them over. It was a sort of negative survivorship bias: before our conversion rate optimization efforts, more of the borderline-interested prospects didn’t fill out a form, so our pool of potential leads still contained a decent number who might convert. After we improved our site, the prospects left over had “survived” our attempts to get them to fill out a form and become an inbound lead because, frankly, they just didn’t find our offer very compelling.
Of course, you may see something different. It could be that conventional wisdom works in your case, though that may just mean you’re not only bad at lead scoring and marketing analytics but also bad at conversion rate optimization. If you are better than that, or want to become better than that, if you care to find the truth and care as much as I do about making the best leads flow to your sales team, find the time to do these analyses and run these tests. Your Sales team might even thank you for sending them slightly less garbage-y leads.