For the last 12 months, Launches have been lower than Unique Visitors. In which scenario does this happen?
Even the background data is not that high.
Based on my observations, this is not working properly on Android, but on iOS it works fine.
Please suggest a better solution.
There can be many reasons behind this, for example:
Regarding your points:
We have already implemented Lifecycle events on both Android and iOS, and both platforms have the same implementation.
I have not added any extra filters in the workspace.
Is it possible that you have older versions of the Android app that don't have Lifecycle metrics included?
In my experience, iOS users adopt new versions of apps sooner, whereas Android users have a tendency not to update unless they are forced to...
How in-depth have you tested your app? Do you have AEP Assurance installed? Are you using proxy testing like Fiddler or Charles? (Odds are, the security measures in Android are probably blocking Android proxy testing; our only way to properly test is using AEP Assurance)... but during your testing, you need to make sure that your Lifecycle metrics are properly firing.
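One common reason Lifecycle hits never fire on Android is that the Lifecycle extension isn't registered at startup. Below is a minimal sketch of what that registration looks like, assuming the AEP Mobile SDK 2.x APIs; `MyApp` and the `"YOUR_APP_ID"` placeholder are illustrative, not from this thread.

```kotlin
// Sketch only: AEP Mobile SDK 2.x extension registration on Android.
// "YOUR_APP_ID" is a placeholder for your own Tags/Launch environment ID.
import android.app.Application
import com.adobe.marketing.mobile.Analytics
import com.adobe.marketing.mobile.Lifecycle
import com.adobe.marketing.mobile.MobileCore

class MyApp : Application() {
    override fun onCreate() {
        super.onCreate()
        MobileCore.setApplication(this)
        MobileCore.configureWithAppID("YOUR_APP_ID")
        // If Lifecycle (or Analytics) is missing from this list, no Launch
        // hits are sent even though the rest of the SDK appears to work.
        MobileCore.registerExtensions(
            listOf(Lifecycle.EXTENSION, Analytics.EXTENSION)
        ) {
            // Start Lifecycle once the SDK is ready.
            MobileCore.lifecycleStart(null)
        }
    }
}
```

Verifying in Assurance that a Lifecycle Start event appears on every cold launch is a quick way to confirm this wiring is in place.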
You should also try breaking down your data above to check what versions are showing, and see if there are any major version numbers missing Lifecycles.
Even when Android and iOS have the same implementation, there can be errors that only present in one OS.
We have React Native apps that control both iOS and Android, but we have experienced issues relating to Lifecycle in just Android in the past, or issues with both, with completely different behaviours... this is because there are often "extensions" to control the Android or iOS code variations, and those can cause some issues to present in only one OS.
We employ Assurance for testing, and according to our testing, launches are functioning normally. However, there seems to be some disparity.
As advised by Adobe, we have turned off background tracking.
But according to AA workspace, there isn't a lot of background data.
What happens if you look at Launches and App Users and Visits (and any other key metrics you think might be helpful) and have the first breakdown be "Operating System Types" (like you have), then break those down by "App Id" (this should be a standard dimension that captures your app name, version, and build number)?
Do you see any instances where a specific version of the app has Users / Visits but is missing Launches?
Also, I am not sure about your foreground/background tracking... but I do know from experience that if you have background tracking still running (as in you didn't pause analytics when the app is sent to the background), then your Crashes metric will be almost 100%... and the app can't distinguish "unexpected stops" from people closing the app via "swiping off" when the app is in the background. We had that issue on our initial app releases years ago and had to do a quick fix to get rid of it so that we could actually use our crash data.
Is there any specific data you are trying to capture when the app is in the background? It seems to me like the app should be paused and not doing much of anything anyway..... the old SDK 4 paused and restarted the Lifecycle automatically... not sure why version 5 has us manually control this...
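The pause/restart the posts above describe is done manually in SDK 5+ by calling the Lifecycle APIs from the Activity lifecycle. Here's a minimal sketch, assuming the AEP Mobile SDK 2.x for Android; `MainActivity` is an illustrative class name.

```kotlin
// Sketch: pausing/resuming Lifecycle so backgrounding is not counted as a
// crash and each return to the foreground can fire a new launch.
// Assumes AEP Mobile SDK 2.x for Android.
import android.app.Activity
import com.adobe.marketing.mobile.MobileCore

class MainActivity : Activity() {
    override fun onResume() {
        super.onResume()
        // Starts a new Lifecycle session if the session timeout elapsed
        // while the app was in the background.
        MobileCore.lifecycleStart(null)
    }

    override fun onPause() {
        super.onPause()
        // Without this call, sessions never end cleanly: backgrounded apps
        // can be reported as crashes, and repeat launches go uncounted.
        MobileCore.lifecyclePause()
    }
}
```

Pairing these two calls in every Activity (or once at the Application level) is what lets the SDK distinguish a backgrounded app from a crashed one.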
Hi Jennie,
I used the App ID breakdown filter in the AA workspace. Jennie, I think I noticed something peculiar in it.
This month, the number of launches and users for the oldest app versions has declined, while the most recent versions and their users are doing well.
However, my problem is that whenever a new app version is released, the user count initially appears high but then gradually declines over time. I'm not sure why this occurs.
Just to clarify:
Are you saying that the newest version always looks good and old versions don't... and if you look at previous months of data, you see the same behaviour?
As in.. this month version 1.5 is the newest and it looks fine, but 1.4, 1.3 and 1.2 have issues...
But if you look at last month, 1.4 looks fine, and 1.3, 1.2 and 1.1 have issues?
Or are you saying that version 1.5 looks okay, and all previous versions have issues, but you are seeing a decline of use on 1.4, 1.3, etc?
Because you are looking at User data, you should see all the versions that each user used in your time frame... but older versions usage should fall off as time passes and the users upgrade their apps...
What I am seeing in your screenshot is that the newest version (I assume) seems to have the best (and most realistic) ratio, while the other versions have issues... it may be that older versions of your app weren't tracking Lifecycle metrics consistently and it led to the results you are seeing... and because you still have users on those old versions, they are impacting the overall average when you look at the data at a high level...
The actual issue is that every time a new app version is released, launches (and launches/user) initially work properly, but after 15 to 20 days they drop automatically. For example, launches = 20 while app users = 35. As per Fredrik's solution, this indicates that some entry points are missing.
Therefore, I want to examine this case and carry out Assurance testing.
Yes, this definitely sounds like you need some deep-dive testing!
Oh, I just thought of something... if you aren't pausing and restarting Lifecycle when the app is sent to the background, then for users who never "close" the app (fully close it, i.e. swipe it off in the app switcher), new Lifecycle launches will never trigger... this is likely the issue...
As the month rolls over, you will have User data but no Launch associated with it. This will start to throw your Launches per User out of whack (particularly when you are looking at specific time frames).
A lot of people probably don't fully close the app, but just leave it in the background; you will only track the "launch" event the first time the app is opened, then never again.
Android and iOS handle things like deep linking very differently. Even if the implementation should be the same in theory, there can be important differences depending on the way users enter the app. Those are hard to spot during debugging.
Have you tried using entry-dimensions (like Entry Page, if you use the default Page Name dimension, or any other dimension that shows interactions) with the Launches metric?
Thank you. There are some big differences with some Entry Pages. It might be worth double-checking whether the Android Activities in your app have Lifecycle configured correctly for the Entry Pages with an especially low Launches/User ratio, like the FHCR LP item.
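One way to avoid per-Activity wiring mistakes (which is typically how deep-link entry Activities end up missing Lifecycle) is to hook Lifecycle once at the Application level. A sketch, assuming the AEP Mobile SDK 2.x; the `LifecycleHooks` class name is illustrative:

```kotlin
// Sketch: registering Lifecycle hooks once for the whole app so every
// Activity -- including deep-link entry points -- starts/pauses Lifecycle.
// Assumes AEP Mobile SDK 2.x for Android.
import android.app.Activity
import android.app.Application
import android.os.Bundle
import com.adobe.marketing.mobile.MobileCore

class LifecycleHooks : Application.ActivityLifecycleCallbacks {
    override fun onActivityResumed(activity: Activity) {
        // Fires for whichever Activity the user entered through.
        MobileCore.lifecycleStart(null)
    }

    override fun onActivityPaused(activity: Activity) {
        MobileCore.lifecyclePause()
    }

    // The remaining callbacks are not needed for Lifecycle tracking.
    override fun onActivityCreated(activity: Activity, savedInstanceState: Bundle?) {}
    override fun onActivityStarted(activity: Activity) {}
    override fun onActivityStopped(activity: Activity) {}
    override fun onActivitySaveInstanceState(activity: Activity, outState: Bundle) {}
    override fun onActivityDestroyed(activity: Activity) {}
}

// In Application.onCreate():
//   registerActivityLifecycleCallbacks(LifecycleHooks())
```

With this in place, a deep link that opens a secondary Activity still triggers a Lifecycle launch, which should close the gap on those low-ratio Entry Pages.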