Hello,
I have been running a paid social media campaign and have noticed a large discrepancy between what appears in Adobe Analytics Workspace and what I receive from Bitly/LinkedIn.
In Adobe Analytics, I created a segment that includes the custom URL I've been using specifically for the paid ad (see below).
In total, Adobe Analytics suggests I've had 10 unique visitors; on Bitly, however, this figure rises to 148 clicks. I have taken into account that I'm comparing unique visitors with clicks, so I should expect some discrepancy, but this seems quite high. LinkedIn shows that I've targeted the same subset of audiences no more than once with the ad, so I shouldn't expect the total number of clicks to be inflated by repeat exposure.
Has anyone else had a similar experience?
This is a very common click-versus-visit discrepancy when comparing two separate reporting tools, in this case LinkedIn and Adobe Analytics.
First, to capture a unique visit, a person needs to reach your site and complete a page load that fires a beacon to Adobe's servers. In most cases this means both the header and footer code have to have fired completely.
On the ad-serving side (LinkedIn, in this case), a click is, as the name implies, captured the moment a link or button is clicked, whether by a human or a bot. If a user clicks the link but closes the browser before reaching your site, stops the page load, or is running a headless browser, then no properly tagged session is ever generated on your site.
Some users also block cookies or tracking code, so they do visit but simply aren't tracked.
So in your case it's 148 clicks to 10 unique visitors. I would first realign the metrics to compare clicks to visits; this should bring the ratio down, but it's still not unusual to see a 30-70% difference between ad-server clicks and analytics visits.
What can you do to help remove the bots? Digital ad servers may have a bot-filtering function, but they still let too many bots through. What I like to do, where possible, is delay the firing of the ad provider's pixel until the user has been on the page for 2 to 3 seconds. This removes many of the bots and also ensures you're counting engaged users, since most people cannot read a full page's worth of content in less time than that.
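For illustration, a delayed pixel fire can be as simple as wrapping the pixel request in a timer. This is only a rough TypeScript sketch: the pixel URL is a hypothetical placeholder, the 3-second threshold is a judgment call, and your ad provider's tag may require a different loading mechanism than a plain image pixel.

// Hypothetical pixel URL; substitute whatever your ad provider gives you.
const PIXEL_URL = "https://ads.example.com/pixel?campaign=paid-social";

window.addEventListener("load", () => {
  // Wait ~3 seconds after the page load before requesting the pixel.
  // Headless bots and instant bounces rarely stay on the page this long.
  setTimeout(() => {
    const img = new Image();
    img.src = PIXEL_URL; // the GET request is what registers the conversion
  }, 3000);
});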
You'll find this technique brings the ad platform's numbers much more in line with what actually visits your website, as reported by Adobe Analytics. Ad servers also report viewable impressions and click rates; see if you can have those reported as well, since they help you measure real users.
FYI: in every case where I've done a deeper server-log analysis of such click-throughs, I've seen a lot of bot behaviour in the requests hitting our website.
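If you want to sanity-check this yourself, a quick pass over the raw access log will surface the obvious bots. A rough sketch (TypeScript on Node), assuming a hypothetical log path and using a deliberately crude user-agent pattern; real bot detection is fuzzier than a single regex:

import { readFileSync } from "fs";

// Hypothetical path to a standard combined-format web server access log.
const log = readFileSync("/var/log/nginx/access.log", "utf8");
const lines = log.split("\n").filter((l) => l.length > 0);

// Flag the obvious offenders: self-declared crawlers and headless browsers.
const botPattern = /bot|crawler|spider|HeadlessChrome|PhantomJS/i;
const botHits = lines.filter((line) => botPattern.test(line));

console.log(`${botHits.length} of ${lines.length} requests look automated`);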
GLTU
Thank you Pablo, this makes more sense now. I did suspect that trying to compare two different metrics would complicate things; thanks for the suggestion.