
Adobe Analytics Tool Adoption

Level 10

There is Analysis. Then there are non-analysis tasks such as report generation, tool configuration, verifying how Adobe Analytics (AA) counts, etc. And there is tool adoption. Tool adoption is getting people within the organization to use Adobe Analytics to make better business decisions. You and Management want to see better adoption as an indicator that the Business is making fact-based decisions. Management also wants to see better adoption as a value measure for the tool: AA is expensive, so staff had better be using it.

  1. How can you increase tool adoption?
  2. How can you measure tool adoption?

The point of this story is to elicit your own tool adoption efforts/stories. I hope you will weigh in using the comments.

How can you increase adoption?

We have a distributed metric model where users can self-serve rather than waiting for the Analyst's to-do list to catch up with their request. So what are our users' impediments to adoption?

Staff is busy, and learning AA is usually not their main job. And as easy as Workspace is to use, you still have to learn how to use Workspace. The learning curve is more like driving a car than riding a bicycle. (Admittedly, Adobe is a Maserati, not a Toyota, in the range of analytics tools.) And even once you learn the tool, it's just the tool. Analytics can be complicated. You still need to know what to do with the information you get from the tool. All of this creates adoption "friction".

Most of our efforts to increase adoption revolve around training and ongoing communication with users.

  • We provide ideas on analysis techniques that can provide insight.
  • We try to make learning the tool as easy as possible by providing training and documentation.
  • We provide example reports/templates that show off what can be done in the tool.

Here are some specific things we have done:

  1. To be able to use the tool, users need to understand the data architecture and what information is available. We created a full library in Confluence that includes every Adobe dimension/metric and every custom dimension/metric/segment (in addition to our more concise standard data dictionary). The documentation is specific to the use cases of our business and includes:

    • The definition of each dimension/metric/segment
    • How it is used to provide insight
    • Example report screenshots
    • Related reports/variables
    • How the variable is implemented/configured
    • Where the data elements come from in our system
    • The format of the values
    • Who defines/owns the values


  2. We use CliffsNotes-style versions of the above documents in Adobe's new, inline data dictionary. This makes the information easily accessible at the point of use.

  3. We created tutorials for Workspace using real site data and business goals (in addition to other topics such as how to do Analysis). We also have pointers to Experience League documents and links to Adobe training videos.

  4. We send out short, weekly tip emails to all AA users on how to use Workspace or how to do analytics. (These are like the Adobe Tips and Tricks sessions at Summit conferences). They are mostly excerpted from our Confluence documents (see above) but include Workspace feature updates as they occur and are relevant.

    Many of these Tips don't change much over time. When we get through the list, we start over, updating each tip from the last time it was sent. (This also saves time, since we are not creating copy from scratch.) There is a lot to know about AA; it takes about two years to cycle through our list of tips. Repeating Tips serves both as a reminder for existing staff and as a first exposure for new staff.


  5. We create product-based metric documents. These include implementation instructions for developers and, important for tool adoption, reporting plans reflecting the business goals for that product. The reporting plan provides specific instructions on how to pull the requested metrics for that specific product.

  6. We have done live training, both in person (pre-COVID) and over Zoom. Most of these were scheduled, trainer-guided sessions. We also tried some drop-in, open-question sessions.

  7. We have created company specific templates for users as starting points for their own Workspaces. They are housed in the Workspace Company Folder. These are an opportunity to display Workspace capabilities.

Notes:

A. The live training was not all that effective. Retention of the information was not very good and attendance was mostly middling.

B. A limitation is getting people to actually go to the documentation or read the Tip emails. While we have a distributed metric model where users can self-serve, there is always pressure for the (expensive) dedicated analysts to just provide "answers". You could view this as a "you can lead a horse to water, but you can't make it drink" issue. But that is an excuse, not a solution.

C. Not all Workspace users are the same. We have over 600 AA users. The documentation and tips need to speak to the many different audiences that consume the metric data, each with their own interests, needs, and skill levels. Useful information will differ by level of organizational responsibility and by how a given department needs to understand the site. Aside from different job descriptions, there are different personality types. For example, some people are natural "tool users" who want to know the details of how a tool works, while others have less interest in the tool itself and may feel that a lot of detail is "too many words".

As marketers we would want to tailor different messages for different audiences. However, creating and maintaining multiple versions of the documentation is too costly. The best we can do is write the documentation with multiple audiences in mind. This turns out to be hard to do.

D. Staff turnover really impacts adoption. If a user leaves the company, the new person starts over on the adoption curve (most new hires seem to “have used Google”). If your company’s average turnover rate is high, that’s going to hurt your AA adoption rate.

How can you measure tool adoption?

The question comes down to how, and how often, people are using Adobe Analytics. To answer this, I use the Adobe Analytics logs: Admin > Logs > Usage and Access Logs.


You can download 3 months of data at a time. For a year’s worth of data, that’s 4 downloads. I then drop the files into a single Excel worksheet.
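Combining the quarterly exports can also be scripted rather than pasted into Excel by hand. A minimal sketch, assuming the downloads are CSV files with consistent headers (the filenames and column names here are hypothetical):

```python
import csv
import glob

def combine_log_exports(pattern):
    """Combine several quarterly Usage & Access Log CSV exports
    into a single list of row dicts. Assumes every export shares
    the same header row; column names are whatever your export has."""
    rows = []
    for path in sorted(glob.glob(pattern)):  # e.g. "aa_logs_*.csv"
        with open(path, newline="", encoding="utf-8") as f:
            rows.extend(csv.DictReader(f))
    return rows
```

From there the combined rows can be written back out to one file, or analyzed directly in Python.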

 

You can treat Adobe Analytics use just like you would the traffic for any website. For example, you can look for a count of unique users, report views per user, logins per user, frequency, recency, top dashboards viewed, % of users on Workspace vs. Reports, % of total users who pull a report broken down by month, number of reports viewed by day of month, etc. If you know what you are paying for AA, you can calculate Cost/Report View.
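As a sketch of that traffic-style analysis, here is how a few of those metrics could be computed from the combined log rows. The "User" and "Event" column names and the "Report Viewed" event label are assumptions for illustration; substitute whatever your log export actually contains:

```python
def adoption_metrics(rows, annual_cost=None):
    """Summarize Usage & Access Log rows (list of dicts with
    hypothetical 'User' and 'Event' columns) into adoption metrics."""
    users = {r["User"] for r in rows}
    views = [r for r in rows if r["Event"] == "Report Viewed"]
    metrics = {
        "unique_users": len(users),
        "report_views": len(views),
        "views_per_user": len(views) / len(users) if users else 0.0,
    }
    # If you know what you pay for AA, Cost/Report View falls out directly.
    if annual_cost is not None and views:
        metrics["cost_per_report_view"] = annual_cost / len(views)
    return metrics
```

The same pattern extends to logins per user, recency/frequency, or per-month breakdowns by grouping on a date column.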

 

You can get more granular if you have a Workspace naming convention. For example, the work-group I work in names all its business-facing dashboards starting with “ConsBI-“. I can then easily identify these in the logs and see if they are being used.
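With that naming convention in place, isolating the team's dashboards in the logs is a one-line filter. The "Event Description" column name is an assumption; use whichever column in your export carries the project name:

```python
def filter_by_prefix(rows, prefix="ConsBI-", name_key="Event Description"):
    """Keep only log rows whose project name starts with a team's
    naming-convention prefix (column name is an assumption)."""
    return [r for r in rows if r.get(name_key, "").startswith(prefix)]
```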

 

There is also an API you can use to extract log information (which I have not gotten around to trying). If you are using the API, I’d think you could import that data into an AA report suite and automate that import. Then you could use Workspace to analyze Adobe Analytics usage. Now there’s a thought.
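As a rough sketch of what such an API pull might look like, the function below builds (but does not send) an authenticated request for a usage-logs endpoint. The endpoint path, query-parameter names, and headers shown are assumptions modeled on the general Analytics 2.0 API pattern; check Adobe's current API reference before relying on any of them:

```python
import urllib.request

def build_usage_logs_request(company_id, access_token, api_key, start, end):
    """Construct a request for a usage-logs endpoint of the Analytics
    2.0 API. Path and parameter names are assumptions -- verify them
    against Adobe's current API documentation."""
    url = (
        f"https://analytics.adobe.io/api/{company_id}/auditlogs/usage"
        f"?startDate={start}&endDate={end}"
    )
    return urllib.request.Request(url, headers={
        "Authorization": f"Bearer {access_token}",  # OAuth/JWT access token
        "x-api-key": api_key,                       # Adobe Developer project key
    })
```

Sending the request on a schedule, then feeding the response into a report suite via data insertion, would close the loop the post imagines: Workspace analyzing Workspace usage.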

 


3 Replies


Community Advisor and Adobe Champion

I love this post! My favorite is having an expanded organization-specific data dictionary in Confluence, and I will definitely be using that idea. 

I do agree that live training can be overwhelming and ineffective as a standalone. For our org, it is great as a preview for folks to see what they can do in Adobe Analytics, and we find that if we supplement it with blog posts (for step-by-step instructions and use case examples) and office hours, people are more inclined to use the tool.

Thank you for sharing!


Community Advisor

This is great. Thank you for taking the time to post this valuable info. I am sure it is useful for many people who want to start using AA, as well as those who are already using it.


Level 10

Christel Guidon has created some operational/FAQ dashboards. Excellent idea. See her post on those here: https://experienceleague.adobe.com/docs/analytics-learn/tutorials/administration/admin-tips/create-o...