I don't know of any white papers per se, but when you say "variances between backend reporting systems" I assume you mean differences between the stats collected by Adobe and those from other systems (a local database or something similar).
First off, every system works slightly differently, so that alone is going to cause some differences. Then you likely have differences in your bot exclusion rules, which will account for more. On top of that there is user opt-out (through your own opt-out options, but also browsers and browser extensions blocking client-side analytics), people disabling JS or random JS failures breaking the client-side tracking, latency in the tracking that lets people leave the page before the call fires, and so on.
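As a rough illustration of how these factors compound, here is a minimal Python sketch. The loss rates are purely hypothetical placeholders, not benchmarks for any real deployment:

```python
# Illustrative only: hypothetical loss rates for a client-side tag,
# compared against a server-side log that sees (nearly) every request.
loss_factors = {
    "ad blockers / tracking protection": 0.08,  # hits never sent
    "JS disabled or script errors":      0.02,
    "opt-outs":                          0.01,
    "left page before the tag fired":    0.03,
}

capture_rate = 1.0
for reason, loss in loss_factors.items():
    capture_rate *= (1.0 - loss)

print(f"Client-side tag captures ~{capture_rate:.1%} of what a server log sees")
# With these made-up numbers the tag under-reports by roughly 13-14%,
# before bot-exclusion differences push the two systems even further apart.
```

The point is not the specific numbers; it's that several small, independent causes multiply into a gap that can look alarming if you expect the systems to match exactly.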
Remember that web analytics is less about "precise" data and more about trends. There will always be variances, but once you have a baseline for the differences you can watch that the different systems all go up, down, or stay constant by roughly similar amounts.
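One way to operationalize that is to track the ratio between the two systems over time and only investigate when it drifts well outside its usual range. A minimal sketch, with made-up daily totals and an arbitrary 3-sigma threshold:

```python
from statistics import mean, stdev

# Daily visit counts from two systems (hypothetical numbers).
adobe   = [10250, 9800, 11020, 10400, 9950, 10600, 10100]
backend = [11800, 11300, 12650, 12000, 11500, 12150, 13900]  # last day drifts

# Ratio between systems per day; the absolute gap matters less than its stability.
ratios = [a / b for a, b in zip(adobe, backend)]

# Baseline from the earlier days, then flag days that drift noticeably.
baseline = mean(ratios[:-1])
spread = stdev(ratios[:-1])

for day, r in enumerate(ratios, start=1):
    drifted = abs(r - baseline) > 3 * spread
    flag = "  <-- investigate" if drifted else ""
    print(f"day {day}: ratio {r:.3f}{flag}")
```

With something like this you stop chasing the fact that the numbers differ and instead watch whether the relationship between them stays steady.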