
Reporting Identity in GA4
Learn more about Reporting Identity in GA4 and how you can use it.
You’ve spent hours, maybe even days, crafting what you thought was the perfect blog post. You hit publish, and… nothing. A trickle of traffic, maybe, but certainly not the flood you were hoping for.
It’s a common frustration for marketing managers and agencies alike: content that just sits there, not pulling its weight.
But what if you could systematically identify these underperforming articles and get a clear, data-driven roadmap on how to improve them?
Here’s the thing: you can. By combining the power of Screaming Frog with data from Google Analytics 4, Search Console, and the AlsoAsked API, you can build a powerful content audit machine. This isn’t about guesswork; it’s about using the tools you already have to make smart, strategic decisions.
First things first, we need to get Screaming Frog talking to Google. The goal here is to pull traffic and impression data directly into our crawl, giving us a single view of every URL’s performance.
In your Screaming Frog configuration, you’ll need to connect to two key APIs:
Google Analytics 4: Navigate to Configuration > API Access > Google Analytics. Authenticate your account and select the correct property and data stream. The most crucial setting here is the date range. Extend it to the past 12 months to get a meaningful amount of data. You can keep the default metrics, but for this process, the main one we care about is Views. If you’re not an e-commerce site, feel free to deselect metrics like Total Revenue to keep your crawl clean.
Google Search Console: Next, go to Configuration > API Access > Google Search Console. Again, connect your account and, just like with GA4, set the date range to the past 12 months. This ensures we’re comparing apples with apples, at least from a timescale perspective. It’s entirely up to you which metrics you include, but for this process the main one we care about is impressions. You could, of course, choose different metrics to suit your mood. Perhaps revenue by landing page, ooooooh. If you ever want to sanity-check the numbers Screaming Frog pulls in, there’s a quick API sketch just below.
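As promised, here is a minimal Python sketch that queries the same GA4 data directly via the official GA4 Data API client, which can be handy for sanity-checking the figures Screaming Frog pulls in. It assumes Application Default Credentials are set up with access to your property; the property ID is a placeholder, and the same idea applies to Search Console via its Search Analytics API.

# Minimal sketch: pull the last 12 months of views per page path from the
# GA4 Data API. Assumes Application Default Credentials are configured and
# that the property ID below is replaced with your own.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange,
    Dimension,
    Metric,
    RunReportRequest,
)

client = BetaAnalyticsDataClient()

request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    dimensions=[Dimension(name="pagePath")],
    metrics=[Metric(name="screenPageViews")],
    date_ranges=[DateRange(start_date="365daysAgo", end_date="today")],
    limit=10000,
)

response = client.run_report(request)

for row in response.rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)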
Once that’s set up, you’re ready to run your crawl on your chosen site. To keep things focused, you could even limit the crawl to a specific subdirectory, like /blog/.
Screaming Frog has excellent guides on the above which you can read here.
With the crawl complete, you’ll have a spreadsheet full of URLs, each enriched with GA4 views and GSC impression data.
Now, we need to sift through this to find the pages that need attention.
This is where you get to play analyst. You need to define what “underperforming” actually means for you. There’s no single magic formula, but a great starting point is to create a set of conditions.
Export your crawl data to Google Sheets or Excel and add a new column called “Status”. Then, create a formula to flag pages for review. For example, your logic might be:
Review this page IF:
It was published more than 6 months ago (giving it some time to mature), AND
It has fewer than 500 views from GA4, AND
It has fewer than 1,000 impressions from Google Search Console.
If a URL meets all these conditions, flag it as “Review”. If not, mark it as “OK”.
This simple filtering exercise will give you a clean, actionable list of URLs that are prime candidates for a content refresh.
For reference, my Sheets formula is this:
=IF(AND(EDATE(B7, 6) < TODAY(), AV7 < 500, AZ7 < 1000), "Review", "OK")
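If you’d rather do the filtering outside of Sheets, here’s a rough pandas equivalent. The column names are assumptions on my part, so match them to the headers in your own Screaming Frog export.

import pandas as pd

# Load the Screaming Frog export. The column names used below ("Publish Date",
# "GA4 Views", "GSC Impressions", "Address") are assumptions; rename them to
# match your own export headers.
df = pd.read_csv("internal_html.csv")

six_months_ago = pd.Timestamp.today() - pd.DateOffset(months=6)

df["Status"] = "OK"
needs_review = (
    (pd.to_datetime(df["Publish Date"]) < six_months_ago)  # older than 6 months
    & (df["GA4 Views"] < 500)
    & (df["GSC Impressions"] < 1000)
)
df.loc[needs_review, "Status"] = "Review"

# Save just the URLs flagged for review, ready for the list crawl later on.
df.loc[df["Status"] == "Review", "Address"].to_csv("review_urls.csv", index=False)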
Right, you’ve got your list of underachievers.
The next question is, why are they underperforming? Often, it’s because they aren’t fully answering the breadth of questions your audience is asking on the topic.
This is where we go back to Screaming Frog, but with a clever twist.
Switch to List Mode: Instead of crawling a whole site, change Screaming Frog’s mode from ‘Spider’ to ‘List’.
Upload Your URLs: Paste the list of “Review” URLs you identified in the previous step.
Configure Custom Extraction: This is the clever part. Navigate to Configuration > Custom > Extraction. You’ll use a pre-built extractor that connects to the AlsoAsked and OpenAI APIs. This will find all the “People Also Ask” questions related to your target keyword for each URL.
You’ll need to add your API keys for AlsoAsked and OpenAI. Once configured, run the crawl.
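If you only have a handful of URLs, you could also call the AlsoAsked API directly rather than going through Screaming Frog. The sketch below is purely illustrative: the endpoint, header name and payload fields are assumptions on my part, so check them against the current AlsoAsked API documentation before relying on it.

import requests

ALSOASKED_API_KEY = "your-alsoasked-api-key"  # placeholder

def fetch_paa_questions(keyword):
    # Endpoint, auth header and payload fields are assumptions; confirm them
    # against the AlsoAsked API docs.
    response = requests.post(
        "https://alsoaskedapi.com/v1/search",
        headers={"X-Api-Key": ALSOASKED_API_KEY},
        json={"terms": [keyword], "language": "en", "region": "gb", "depth": 2},
        timeout=60,
    )
    response.raise_for_status()

    # Flatten whatever question tree comes back into a simple list.
    questions = []

    def walk(node):
        if isinstance(node, dict):
            if "question" in node:
                questions.append(node["question"])
            for value in node.values():
                walk(value)
        elif isinstance(node, list):
            for item in node:
                walk(item)

    walk(response.json())
    return questions

print(fetch_paa_questions("content audit"))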
By using the AlsoAsked API, you can find opportunities to add complementary content that helps bring the piece up to the position in Google search where you’d expect it to be.
For a more detailed walkthrough on setting this up, you might want to read this guide from Mark Williams-Cook (I mean, he made AlsoAsked, so he probably knows what he’s talking about).
When the list crawl finishes, the custom extraction column will contain a goldmine of information: a suite of relevant, unanswered questions for each of your underperforming articles.
Your job is now simple:
Go through the list of questions for each article.
Weave the answers naturally into your existing content.
This might mean adding a new H2 section, creating a dedicated FAQ at the end of the post, or expanding on an existing point.
By doing this, you’re not just adding more words; you’re directly addressing user intent, increasing the topical authority of your page, and giving Google exactly what it wants to see.
You’re turning a piece that wasn’t hitting the mark into a comprehensive resource that deserves to rank.
Of course, this shouldn’t be done in a silo; it should sit alongside broader recommendations for improving the page’s rankings.
Feeling overwhelmed by content that doesn’t work is a thing of the past. With this process, you have a repeatable system for auditing and improving your blog’s performance.
To recap:
Connect GA4 and GSC to Screaming Frog with a 12-month lookback.
Crawl your site and export the data.
Filter your URLs based on performance thresholds to create a “review” list.
Re-crawl that list using custom extraction to find content gaps via AlsoAsked.
Update your articles with the new questions and answers.
Q: Why use a 12-month date range?
A 12-month window gives you enough data to account for seasonality and avoids making decisions based on short-term traffic blips. It provides a much more stable and reliable view of a page’s true performance over time.
Q: Can I do this without a paid AlsoAsked API key?
While the API makes this process scalable, you could perform the core task manually. For each URL on your “review” list, you could manually search its primary keyword on Google and note down the questions in the “People Also Ask” box. Or, of course, use the main AlsoAsked interface. It’s more time-consuming but achieves a similar result for a smaller batch of URLs.
Q: How often should I run this content audit?
A good cadence is to run this audit quarterly. This gives new content enough time to gain traction (or not) and allows you to stay on top of any performance declines in older content before they become a major issue.


Author
Hello, I'm Kyle Rushton McGregor!
I’m an experienced GA4 Specialist with a demonstrated history of working with Google Tag Manager and Looker Studio. I’m an international speaker who has trained 1000s of people on all things analytics.