
Types of Reporting


Analysis is the bridge that connects a mountain of information to meaningful insights. One essential part of analysis is generating analytic reports, which serve as its foundation. Reports come in many kinds: monitoring reports, test reports, inferential reports, projections, dashboards, sponsor reports, and diagnostic reports, each with a different purpose. Very broadly, though, most reports you pull from Adobe Analytics fall into two types: monitoring reports and analysis reports.

Monitoring Reports

Monitoring reports are designed to do just what their name suggests – monitor and identify changes in key metrics and dimensions. They look for change. Monitoring reports tend to contain the dimensions and metrics for your site or product that you look at all the time, which differentiates them from reports meant to answer specific, short-term questions or to reveal user behaviors. Monitoring reports are the surveillance cameras that keep a close eye on the metrics that matter most to a business.

Changes detected in a monitoring report serve as a trigger for further investigation. When an unusual fluctuation is observed, the next step is to dig deeper and understand the underlying causes of that change. These reports are not necessarily actionable on their own; rather, they direct your attention to where it is needed.
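The "trigger for further investigation" can often be automated. As a minimal sketch (the threshold of two standard deviations and the synthetic traffic numbers are illustrative assumptions, not Adobe Analytics output), a script might flag any day whose key metric deviates sharply from its recent baseline:

```python
# Sketch: flag unusual fluctuations in a daily key metric against a
# trailing baseline. Threshold and data are invented for illustration.
from statistics import mean, stdev

def flag_anomalies(daily_values, window=7, threshold=2.0):
    """Return indices of days whose value deviates from the trailing
    window's mean by more than `threshold` standard deviations."""
    flagged = []
    for i in range(window, len(daily_values)):
        baseline = daily_values[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(daily_values[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Steady traffic with one sudden spike on day 10 (index 10).
visits = [100, 102, 99, 101, 100, 98, 103, 101, 100, 99, 180, 101]
print(flag_anomalies(visits))  # → [10]
```

A flag like this is not an answer; like the monitoring report itself, it only tells you where to look next.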

A well-structured monitoring report typically contains the key metric drivers for a specific part of the business. For instance, if your business's success depends on repeat use, the monitoring report will prominently feature this metric. Additionally, it may provide basic segmentation to understand the factors contributing to the overall figure. This segmentation might include repeat use by site section, application, or other functional area. However, it's crucial to limit the amount of detail as too much information can hinder the report’s effectiveness.

The report is often presented in a trended view and/or in comparison to other similar site elements. This provides context. Context helps you gauge the significance of any changes observed. The trended view compares the thing being tracked to itself over time. Comparing the item of interest to other similar elements provides a relative value benchmark.
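Both kinds of context can be computed directly. The sketch below (section names and repeat-use figures are made up for illustration) shows an item compared to itself over time and to a peer-average benchmark:

```python
# Sketch of the two kinds of context: an item's own trend
# (period-over-period change) and a benchmark against similar items.
# Section names and figures are invented, not real report data.

def pct_change(current, previous):
    """Percentage change from previous to current."""
    return (current - previous) / previous * 100

repeat_use = {            # repeat-use rate by site section: (this week, last week)
    "search":   (0.42, 0.40),
    "account":  (0.55, 0.56),
    "checkout": (0.31, 0.38),
}

peer_avg = sum(cur for cur, _ in repeat_use.values()) / len(repeat_use)

for section, (cur, prev) in repeat_use.items():
    trend = pct_change(cur, prev)
    vs_peers = pct_change(cur, peer_avg)
    print(f"{section:9s} trend {trend:+6.1f}%  vs. peers {vs_peers:+6.1f}%")
```

In this invented data, checkout is both down week-over-week and well below its peers, which is exactly the kind of signal a monitoring report should surface.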

Monitoring reports are typically provided on a regular schedule, such as monthly, weekly, or daily. The frequency depends on the needs and the specific goals of the business. If the information isn't required on a regular basis, it might not fit into the category of monitoring reports.

To allow apples-to-apples comparisons, the report needs to be consistent over time: it should track the same dimensions and metrics in August as it does in January. Because the reports are consistent, they are often automated. As you can probably guess, a dashboard is usually a form of monitoring report.

Analysis Reports

Analysis reports, on the other hand, are where the deep work of understanding takes place. The Adobe Analytics tool is designed to facilitate this type of analysis by analysts, marketers, product managers, and others. Although organizations and experts approach analysis reporting in many ways, their processes share a basic shape consisting of two primary stages:

The first stage is exploratory. Imagine a paleontologist faced with an entire mountain – which rock should they investigate? Analysis is time-consuming, and it's simply not feasible to explore everything. Exploratory reporting uncovers patterns and relationships in the data that may be worthy of further exploration. It is the guiding hand that points you in the right direction. You may have already been pointed in a direction by a change in your monitoring report or a colleague's question, and your job may have already limited the scope of your inquiry. (For example, if you are tasked with improving internal search click-through, you might decide not to worry about SEO landing pages.) You can also create a list of "brainstorm" ideas (theories), where the list becomes a plan that you can work through for your initial exploration.

You then pull reports for the items on your list. Many of these ideas may not show anything of interest in the reports. If there is nothing of interest, move on to the next item.  If there is something of interest, put a pin in it and move to the next item. When you are done, you will hopefully have some pins (“Look Here” pointers) that are your candidates which you can then prioritize for the next level of analysis, the deeper dive.

The purpose of your deep-dive reporting (the second stage) is to help you get a clear picture of the behavior reflected in your data. In this stage, you will often be pulling multiple reports and manipulating the data in Workspace, Excel, or other custom tools. The reports underlying the analysis tend to look at many dimensions and metrics in combination to reveal the activity.  Thus, the reporting tends to be complex compared to monitoring reports. Furthermore, the reports are often unique rather than being the same report that you look at every week. Unlike monitoring reports, the result is intended to be highly actionable.
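The "many dimensions and metrics in combination" idea can be illustrated with a small cross-tab. This sketch uses invented hit-level records, not real Workspace output, to show how breaking one metric down by two dimensions at once can locate where a change is concentrated:

```python
# Sketch of a deep-dive cross-tab: one metric (conversion) broken down
# by two dimensions (section x device). The rows are invented records.
from collections import defaultdict

hits = [
    ("checkout", "mobile", 1), ("checkout", "mobile", 0),
    ("checkout", "desktop", 1), ("checkout", "desktop", 1),
    ("search",   "mobile", 1), ("search",   "desktop", 1),
]

totals = defaultdict(lambda: [0, 0])   # (section, device) -> [conversions, visits]
for section, device, converted in hits:
    totals[(section, device)][0] += converted
    totals[(section, device)][1] += 1

for (section, device), (conv, visits) in sorted(totals.items()):
    print(f"{section:8s} {device:7s} conversion {conv / visits:.0%}")
```

In this made-up data, a drop that looks site-wide in a one-dimensional report turns out to be concentrated in mobile checkout, which is the kind of picture a deep dive is meant to produce.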

The analysis based on the deep-dive reports can tell you what happened, how it happened, and the "so what" of its happening. From this you often infer "why" a user behaved in a certain way. (This inference is the "art" part of analytics.) A further extrapolation may infer user "intent." You can think of metrics as dinosaur footprints from which you, the paleontologist, derive dinosaur actions, then behavior, then goals. It is that analysis, not the report itself, that reveals the insight.

The goal is to modify user behavior to achieve desired outcomes. Over time, you can use the analysis to build a model of the user behaviors and the levers that affect those behaviors. Then you can use that understanding to predict or encourage those desired outcomes.

In summary, monitoring reports keep a vigilant eye on key metrics and dimensions, directing attention to changes that warrant investigation. In contrast, analysis reports involve deep-dive exploration, aiming to understand the "how" and "why" behind user behavior and drive actionable insights. Understanding these differences allows you to better leverage the strengths of each report type.
