Cohort keyword research is a process I’ve created and refined over the past ~2 years as an offshoot of FTF’s larger keyword research offering: identifying a market’s total addressable market online.
What is Cohort Analysis?
If you’re unfamiliar with cohort analysis, Wikipedia defines it as:
Cohort analysis is a subset of behavioral analytics that takes the data from a given dataset (e.g. an eCommerce platform, web application, or online game) and rather than looking at all users as one unit, it breaks them into related groups for analysis.
These related groups, or cohorts, usually share common characteristics or experiences within a defined time-span. Cohort analysis allows a company to “see patterns clearly across the life-cycle of a customer (or user), rather than slicing across all customers blindly without accounting for the natural cycle that a customer undergoes.”
An easier way to understand this is to think back to your high school class. While thousands of people have likely graduated from your high school, one specific cohort would be the group of people you graduated with; for me, this is the class of 2002 from Valley Forge Military Academy.
You can go deeper down this rabbit hole as well: for example, additional cohorts (that I would be a part of) are the 2002 VFMA Calculus 2 class, the 2000 VFMA Varsity Rifle Team, and so on.
The nuance within SEO is applying this approach to sets of keyword data, identifying patterns between cohorts that exist based on keyword-level metrics like search volume, cost per click, and organic difficulty.
The Keyword-Level Data
For this process I am currently exclusively using data from Ahrefs.
As of the time of publication, their keyword difficulty rating is my favorite across all the SEO tools out there.
(If you’d like to see how Ahrefs stacks up next to Semrush, check out my post on Semrush vs. Ahrefs).
The cohorts my process looks at are:
- Competition (difficulty)
- Cost per click
- Monthly search volume
For each metric, I’m going to use data from 5 websites in the same niche and calculate the average across all sites, using that mean as the dividing line that sorts every term into an “above” or “below” bucket for that metric.
This is useful because it allows you to combine these buckets to identify your best opportunities, e.g. keywords with higher than average volume but lower than average difficulty.
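The bucketing step above can be sketched in a few lines of Python. This is a minimal illustration, not the actual template logic; the field names and sample values are hypothetical, not Ahrefs' real export columns:

```python
from statistics import mean

# Hypothetical keyword rows (field names and values are illustrative,
# not Ahrefs' actual export format).
keywords = [
    {"term": "seo audit", "volume": 4400, "cpc": 6.5, "difficulty": 48},
    {"term": "keyword cohorts", "volume": 150, "cpc": 1.2, "difficulty": 12},
    {"term": "backlink checker", "volume": 9900, "cpc": 4.0, "difficulty": 71},
]

# Use the mean of each metric across all terms as the dividing line.
thresholds = {
    metric: mean(kw[metric] for kw in keywords)
    for metric in ("volume", "cpc", "difficulty")
}

# Sort every term into an "above" or "below" bucket per metric.
for kw in keywords:
    kw["buckets"] = {
        metric: "above" if kw[metric] >= thresholds[metric] else "below"
        for metric in thresholds
    }
```

In the template the same comparison happens in spreadsheet formulas rather than code, but the logic is identical: one mean per metric, one above/below label per term per metric.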
How To Use This Sheet
This sheet allows us to identify opportunities by combining the threshold groups for each metric and then applying that query across the entire keyword population.
So, pulling the top 1,000 keyword rankings for each site (sorted ascending by ranking position, starting from position 1, though you could just as easily sort by MSV, difficulty, or whatever your heart desires), I’m able to set a heuristic baseline for each site that says “show me all terms that are”:
- Under the total average difficulty across all sites
- Under the total average CPC across all sites
- Above the total average MSV across all sites
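That three-part baseline amounts to one combined filter over the pooled keyword list. Here is a hedged sketch of it in Python; the rows and field names are made up for illustration:

```python
from statistics import mean

# Hypothetical rows pooled from all five sites' top-1,000 exports
# (field names and values are illustrative).
population = [
    {"term": "seo checklist", "volume": 15000, "cpc": 3.1, "difficulty": 22},
    {"term": "what is seo", "volume": 33000, "cpc": 2.0, "difficulty": 88},
    {"term": "best seo tools", "volume": 8100, "cpc": 9.0, "difficulty": 65},
    {"term": "seo for startups", "volume": 900, "cpc": 4.5, "difficulty": 18},
]

# Total averages across the whole population.
avg = {m: mean(row[m] for row in population) for m in ("volume", "cpc", "difficulty")}

# The baseline: under average difficulty, under average CPC, above average MSV.
opportunities = [
    row["term"]
    for row in population
    if row["difficulty"] < avg["difficulty"]
    and row["cpc"] < avg["cpc"]
    and row["volume"] > avg["volume"]
]
```

Only terms that clear all three thresholds at once survive the filter, which is exactly the query the sheet's dashboard runs against the full keyword environment.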
A Video Walkthrough of the Process
To help run through this entire process, which might also help you identify potential keyword cannibalization issues, I created a Google Sheet. It’s set up with a dashboard view and a “keyword environment” sheet that dynamically shows the individual term lists and all correlating data based on your selected cohort query, and it’s built to support up to 100,000 rows (keywords) across 5 websites.
Get Access to The Google Sheet Template
This template is just one of dozens we’ve built for Traffic Think Tank members.
Sign up for Traffic Think Tank now and get instant access to this sheet, over 300 hours of exclusive content, and some of the most useful discussion threads in the SEO industry.
Get instant access to the cohort keyword template
(and over 300 hours of exclusive SEO process content and templates)