SOLVED

Data Analytics on the Quiz Page


Level 1

Hi All,

 

We have a 15-question quiz, and we want to understand quiz completion and where users drop off at each question, so we tag every single question with Adobe Analytics. However, the report currently shows some abnormalities: for example, we got more unique visitors on Question 5 than on Question 4, and on Question 9 than on Question 11. This doesn't make sense, because unique visitors should not increase as users progress through the questions. If anyone has encountered this before, please advise how to resolve it. Thank you.
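To illustrate what we expect: a minimal sketch (with placeholder numbers) of the kind of check we run against the per-question unique visitor counts, flagging any question that reports more unique visitors than the one before it:

```python
# Placeholder unique-visitor counts for Q1..Q15, e.g. exported from Workspace.
unique_visitors = [1200, 1150, 1100, 980, 1020, 950, 900, 870,
                   890, 860, 840, 820, 800, 790, 780]

# Unique visitors should not increase as the quiz progresses;
# flag any step where they do.
for i in range(1, len(unique_visitors)):
    if unique_visitors[i] > unique_visitors[i - 1]:
        print(f"Question {i + 1} reports more unique visitors "
              f"({unique_visitors[i]}) than Question {i} "
              f"({unique_visitors[i - 1]}) - investigate this step")
```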

 

Regards,

Henny

 

 


4 Replies


Community Advisor

While in theory you're correct that there should be a natural fall-off across the questions, users rarely do logical things.

 

Look at your form: are some of the questions harder to answer without thinking about what they want to say? Users may skip a question and then scroll back up to fill it in once they know what they want to answer; if they hit too many "hard" questions, they may abandon the quiz altogether.

 

If you are using raw data feeds, it might be worth looking at the exact timestamps (down to the millisecond) of each hit to see whether users are bouncing around in the order they answer the quiz.
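For example, here is a minimal sketch of that kind of check, assuming a data feed configured with the standard post_visid_high, post_visid_low, hit_time_gmt, and post_pagename columns (verify the names and order against your feed's column_headers manifest):

```python
import csv
from collections import defaultdict

# Group data-feed hits by visitor ID so each visitor's question sequence
# can be reviewed in time order.
hits_by_visitor = defaultdict(list)

with open("hit_data.tsv", newline="") as f:
    reader = csv.DictReader(
        f, delimiter="\t",
        fieldnames=["post_visid_high", "post_visid_low",
                    "hit_time_gmt", "post_pagename"])
    for row in reader:
        visitor = (row["post_visid_high"], row["post_visid_low"])
        hits_by_visitor[visitor].append(
            (int(row["hit_time_gmt"]), row["post_pagename"]))

# Print each visitor's pages in time order to spot out-of-order or
# missing question hits.
for visitor, hits in hits_by_visitor.items():
    for timestamp, page in sorted(hits):
        print(visitor, timestamp, page)
```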

 

In the meantime, try running an isolated test in your Dev or QA environment: try different browsers, and try on both desktop web and mobile devices, to see whether the tracking triggers fail on the questions you feel should be higher.


Correct answer by
Community Advisor

Here's something you might do as well. Take a look at your Single Page Bounce Rate. It sounds funny, but use that metric in conjunction with your unique visitors, and also create a conversion rate based on the number of quiz questions your users complete, aligning it with your bounce rate for each question completed. You might gain some insight into what's going on and where your test takers are dropping out.
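For instance, a rough sketch of that comparison, assuming per-question unique visitors and single-page bounces exported from Workspace (all numbers are placeholders):

```python
# Placeholder per-question unique visitors and single-page bounces.
unique_visitors = {1: 1200, 2: 1150, 3: 1100, 4: 980, 5: 1020}
bounces         = {1: 300,  2: 90,   3: 80,   4: 60,  5: 200}

starters = unique_visitors[1]
for q in sorted(unique_visitors):
    # Conversion rate: share of quiz starters who reached this question.
    completion_rate = unique_visitors[q] / starters
    # Bounce rate: share of that question's visitors whose visit ended there.
    bounce_rate = bounces[q] / unique_visitors[q]
    print(f"Q{q}: completion {completion_rate:.0%}, bounce {bounce_rate:.0%}")
```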

Jeff Bloomer


Level 1

Hi Jennifer, Jeff,

 

First of all, many thanks for your replies. It's my first time posting here, so it's really heartwarming to receive support from this community.

 

The 15-question quiz is designed so that each page displays one question, and customers can't skip questions (they can't click Next to go to the next question without answering the current one).

 

I will ask the analytics team to provide me with the timestamp information as Jennifer suggested. As for the Single Page Bounce Rate suggestion from Jeff, I'm not sure what it means, but I will check and find out. Thanks again to both.


Community Advisor

Oh, one question per page... and does the URL allow people to "hack" the parameters to skip ahead?

 

Does your application happen to be a single-page application? Maybe your triggers are intermittently failing? Or, depending on lag, maybe people are answering so quickly that they move past questions without giving the tracking enough time to fire? Or maybe there is an intermittent JS error that cascades and causes tracking to fail sporadically?

 

My suggestion would be to test in as many browsers, OSs, and devices (mobile, desktop, tablet) as you have at your disposal and see if you can catch any issues.

 

Does your team have any automated or load-testing tools available? Load testing may not execute your JS, but an automated tool like Selenium or Katalon should. You might want to create a test and run it in a loop against a dev server to see whether all the pages are recorded consistently. While this will be the same "visitor" in your data, you should at least be able to verify via Page Views whether every page/question tracks consistently each time.
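As a starting point, here is a minimal Selenium sketch in Python; the URL and CSS selectors are hypothetical placeholders, so swap in your own quiz page and controls:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

QUIZ_URL = "https://dev.example.com/quiz"  # hypothetical dev/QA URL

driver = webdriver.Chrome()
try:
    for run in range(20):  # repeat to surface intermittent tracking failures
        driver.get(QUIZ_URL)
        for question in range(1, 16):  # 15 questions, one per page
            wait = WebDriverWait(driver, 10)
            # Placeholder selectors -- replace with your real answer/Next controls.
            wait.until(EC.element_to_be_clickable(
                (By.CSS_SELECTOR, ".answer-option"))).click()
            driver.find_element(By.CSS_SELECTOR, ".next-button").click()
finally:
    driver.quit()
```

Afterwards, compare the Page Views recorded for each question against the number of loops you ran; any question that falls short points to a tracking call that sometimes fails to fire.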

 

Good luck, and if we think of anything else we will be sure to post again.