How to Automate Revenue Optimization to Boost Organic Traffic ROI (Part 1)

Building organic traffic is nice, but ROI is what really moves the needle, and that's as true for SEO as for any other marketing channel. This guide will teach you how to automate SEO revenue optimization to boost your organic traffic ROI.

How did I automate this process?

I coded a Python script that fetches data from different sources, blends it all together, and outputs a simple Excel file with a few tabs, based on various segmentations and data points.

What data do I use?

  1. Tableau – for Visits, Revenue, EPV (Earnings per Visit), CR (Conversion Rate), CTR (Click Through Rate) and Unique Clicks

  2. Log files – for Google bot hits (for 200 ok pages)

  3. Google Search Console – for average position per page

What am I trying to achieve?

My goal is to easily identify revenue opportunities in the data and provide a prioritized list of action items, focused on how we can leverage existing content to generate more revenue from the organic channel.

Additionally, this tool enables my coworkers who are not familiar with coding or automation to get the data in a simple format, which can then be used for efficient data analysis.

What was my main challenge?

One of the main challenges I faced was analyzing the Tableau data. When exported to a CSV file, Tableau data looks like this:

Original Tableau Data

For each page, you’ll get the main KPIs (Measure Names), row by row.

Now, the screenshot is filtered to a single page for demonstration purposes, but clear the filter and multiply the 7-9 metric rows by the number of pages on your site, and you end up with a huge, long file that is very hard to analyze and deal with.

Therefore, part of the code in this tool handles this case and switches the Tableau data to a more readable format, with Page in the leftmost column and all the KPIs broken out into separate columns.

It is far more convenient to compare page metrics, sort, and filter when the data comes in this form. It also lets us blend in other page-level metrics and get everything together in one file.
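The reshaping step can be sketched with pandas. The column names here ('Page', 'Measure Names', 'Measure Values') follow a typical Tableau CSV export and are my assumption, so adjust them to match your own file:

```python
import pandas as pd

# Toy long-format Tableau export: one row per page/metric pair (made-up numbers)
df_long = pd.DataFrame({
    'Page': ['/a', '/a', '/a', '/b', '/b', '/b'],
    'Measure Names': ['Visits', 'Revenue', 'EPV'] * 2,
    'Measure Values': [1000, 250.0, 0.25, 400, 20.0, 0.05],
})

# Pivot so each page becomes one row and each KPI gets its own column
df_wide = (df_long.pivot_table(index='Page',
                               columns='Measure Names',
                               values='Measure Values',
                               aggfunc='first')
                  .reset_index())
df_wide.columns.name = None
print(df_wide)
```

From here, other page-level sources (log files, GSC) can simply be merged on the Page column.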

Revenue Data
* The figures in the table above are not real and used only for demonstration

Now, this could probably be solved in other, perhaps less technical ways, but for me it was just a good excuse to automate it :)

Once I cleaned and blended the data from all sources, I got this output:

SEO Data
The figures in the table above are not real and used only for demonstration.

* Vertical: consider this as 'Category' for simplification (Vertical is a private case in MY data)

* Page Types include: Article page, Review page, Blog page, PPC page, Other

Weight & Weighted Avg EPV – are not in the scope of this post, but in a few words, they let you make your comparison more accurate by transforming absolute data into relative data.

For example, a page with a very high EPV on only a small portion of the traffic doesn't mean much when you're trying to compare pages by their EPV.
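One plausible sketch of that idea (my assumption, not necessarily the exact formula used in the tool) is to weight each page's EPV by its share of total visits, so a tiny-traffic outlier stops dominating the comparison:

```python
import pandas as pd

# Made-up figures: one high-EPV page with barely any traffic vs. a steady earner
df = pd.DataFrame({
    'Page': ['/high-epv-low-traffic', '/steady-earner'],
    'Visits': [50, 5000],
    'Revenue': [40.0, 1500.0],
})
df['EPV'] = df['Revenue'] / df['Visits']

# Weight = each page's share of total visits; weighted EPV scales the raw EPV
df['Weight'] = df['Visits'] / df['Visits'].sum()
df['Weighted EPV'] = df['EPV'] * df['Weight']
print(df[['Page', 'EPV', 'Weighted EPV']])
```

The raw EPV favors the low-traffic page (0.8 vs 0.3), while the weighted version correctly surfaces the steady earner.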

Additionally, I created a few graphs from this data, to better understand the big picture and to clearly share what I'm trying to achieve here with other stakeholders and higher management.

Here's an example of two scatter graphs that show the correlation between revenue and position (on the left) and the correlation between revenue and visits (on the right). You can play with your data and build other interesting visualizations to demonstrate ROI.

Page Revenue by Position / Visits
* The figures in the graph above are not real and used only for demonstration
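Graphs like these can be reproduced with matplotlib in a few lines; the figures below are made up, just like in the screenshots:

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen (no display needed)
import matplotlib.pyplot as plt
import pandas as pd

# Illustrative page-level data (not real figures)
df = pd.DataFrame({
    'Revenue': [120, 80, 300, 40, 510],
    'Position': [3.2, 8.5, 1.4, 14.0, 2.1],
    'Visits': [900, 400, 2100, 150, 3000],
})

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Left: revenue vs. average position
ax1.scatter(df['Position'], df['Revenue'])
ax1.set_xlabel('Avg. Position')
ax1.set_ylabel('Revenue')

# Right: revenue vs. visits
ax2.scatter(df['Visits'], df['Revenue'])
ax2.set_xlabel('Visits')
ax2.set_ylabel('Revenue')

fig.savefig('revenue_scatter.png')
```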

What action items did I take?

With this final output, I started analyzing the data across different segmentations, to extract the first few action items for increasing organic traffic revenue.

I might dedicate a separate post to the optimization process, because I have a lot to share on that front, but just to give you an idea of a few of them, I started with:

  1. Optimize pages with high visits & low revenue

  2. Optimize pages with high Gbot hits & low revenue

  3. Optimize pages with high visits & (relatively) low Gbot hits

  4. Quick win – focus on pages with traffic and revenue potential that rank at an average position of 11-20 (second page) and push them to the first page. Here you'll have to dive into keyword analysis to figure out which ones are relevant.

  5. Improve / add CTA elements on pages with low revenue, test and optimize more.

  6. Improve internal linking to important / money pages.

  7. Consider bringing up ‘rising’ traffic pages to main navigation / above the fold / sidebar widget.

  8. Run keyword and competitor research for those pages to improve existing content and find ideas for new content you’re not targeting yet.
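Items 1-3 boil down to simple boolean filters over the blended DataFrame. A minimal sketch, using median splits as an arbitrary threshold choice (pick cutoffs that fit your own data):

```python
import pandas as pd

# Blended page-level data (made-up figures; 'Gbot Hits' comes from the log files)
df = pd.DataFrame({
    'Page': ['/a', '/b', '/c', '/d'],
    'Visits': [5000, 4800, 300, 200],
    'Revenue': [20.0, 900.0, 15.0, 400.0],
    'Gbot Hits': [800, 40, 750, 30],
})

# Item 1: pages with high visits & low revenue
high_visits_low_rev = df[(df['Visits'] > df['Visits'].median()) &
                         (df['Revenue'] < df['Revenue'].median())]

# Item 2: pages with high Googlebot hits & low revenue
high_gbot_low_rev = df[(df['Gbot Hits'] > df['Gbot Hits'].median()) &
                       (df['Revenue'] < df['Revenue'].median())]

print(high_visits_low_rev['Page'].tolist())
print(high_gbot_low_rev['Page'].tolist())
```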

How did I prioritize the work?

Once I had a list of all the pages I'd like to optimize, I used a prioritization technique I recently picked up from a good friend. There are many other ways to prioritize, but I chose this one for its simplicity.

In this technique we transform absolute data into relative data (points). The line below then picks, for each row, the column holding the minimum value, breaking ties at random with NumPy's random choice:

h_data['MIN'] = [np.random.choice(h_data.columns[x == x.min()], 1)[0] for x in h_data.values]

Absolute Data

Relative Data

The lower the Total Points, the higher the priority, and that's where you'll start your optimization process.
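To put the one-liner above in context, here is a minimal self-contained sketch of the idea: rank each metric to turn absolute values into relative points, sum them into Total Points, and use the random-choice idiom to tag each page's lowest-scoring metric. The column names and ranking directions are illustrative assumptions, not the exact scheme from the post:

```python
import numpy as np
import pandas as pd

# Raw (absolute) metrics per page -- illustrative numbers only
raw = pd.DataFrame({
    'Visits': [5000, 300, 1200],
    'Revenue': [20.0, 15.0, 400.0],
}, index=['/a', '/b', '/c'])

# Relative points: rank so the biggest opportunity gets the fewest points
# (high visits -> low points; low revenue -> low points)
h_data = pd.DataFrame({
    'Visits Points': raw['Visits'].rank(ascending=False),
    'Revenue Points': raw['Revenue'].rank(ascending=True),
})

h_data['Total Points'] = h_data.sum(axis=1)

# Tag the lowest-scoring metric per row, ties broken at random,
# mirroring the np.random.choice one-liner above
points = h_data.drop(columns='Total Points')
h_data['MIN'] = [np.random.choice(points.columns[x == x.min()], 1)[0]
                 for x in points.values]

print(h_data.sort_values('Total Points'))
```

Here '/a' (high visits, low revenue) lands at the top of the priority list with the fewest Total Points.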

And, for the fun part – the Python code

In terms of Python code, this tool focuses on the revenue data (Tableau in this case), since I'm already using two other SEO automation tools that I previously developed, a log file analyzer and a Google Search Console data fetcher, to bring their data into this tool.

Nevertheless, the code matters, and since I'm also doing some manipulation on the log file and GSC data (after importing it from the other tools), it makes sense to dive into it.

But first… few things to keep in mind

  1. The tool is semi-automated: for the most part I'm not fetching data from the cloud or an API, but from CSV files.

  2. Tableau – a CSV file that I download into my local Python project.

  3. Log files and GSC – CSV files generated automatically when I run the two tools mentioned above (the GSC fetcher does pull data via API).

  4. 'Charticle' / 'Superlist' data points appearing in the code are a private case, so you might want to ignore them. They have no effect on your ability to run this code properly.

  5. In general, the code needs refactoring and is far from perfect in terms of structure, syntax and efficiency, but it works, and that's what matters!

Code Structure

The code is broken into 4 Python files:

  1. – fetch Google Search Console data

  2. – fetch log files data

  3. – fetch Tableau data

  4. – collect data from all the above, clean it, merge it and output it to an Excel file.

Google Search Console Data –

  • Import GSC data from a CSV file

  • Group by page

  • Get the latest position (date-wise)

import pandas as pd

def get_gsc_data():
    # Load the raw GSC export; the CSV has no header row, so name the columns here
    path_gsc = 'data\\source-gsc.csv'
    gsc_column_names = ['Clicks', 'CTR', 'Impressions', 'Position', 'Page', 'Date']
    df_gsc = pd.read_csv(path_gsc, names=gsc_column_names, index_col=0)
    # Normalize Page values. Caution: the search pattern here appears to have
    # been lost (replacing '' would insert '/' between every character), so set
    # the first argument to whatever prefix/encoding your export actually uses
    df_gsc['Page'] = df_gsc['Page'].str.replace('', '/')
    # Parse dates; unparseable values become NaT instead of raising
    df_gsc['Date'] = pd.to_datetime(df_gsc['Date'], errors='coerce')
    return df_gsc

def group_gsc_by_page(df_gsc):
    # Sort by date so 'last' picks each page's most recent position
    df_group_by_page = df_gsc.sort_values('Date').groupby(['Page']).agg(Position=('Position', 'last'))
    return df_group_by_page