We ran a test in Target to find out how mbox parameters work.
Now we would like to remove those parameters from our global mbox to avoid clutter.
Do we have to create another mbox?
Is there any other solution to discard those test parameters?
Is the problem solved? If not, please let us know so that we can help you further.
Thanks & Regards
Hello, the problem is almost solved.
I realized some points were missing from my initial question:
First, we wanted to leverage the power of the Random Forest ML algorithm in our Adobe Target setup.
We therefore wanted to use insights we had learned about our clients through DMPs and find
the discriminant values in those variables. But when using the global mbox to fire the experience, we ran into
a problem:
The most discriminant variables turned out to be screen size, browser, and mobile/desktop information.
This was because our current experience was far more effective on mobile than on desktop.
This led us to 3 conclusions:
- Machine learning was learning on the wrong values
- Machine learning was easily skewed by platform effects
- Never set up an ML experience without first running an A/B test
and going through the analysis
So how did we solve the problem? It was easier than we thought, but also a little tricky.
By the way, deploying a custom mbox through DTM is de facto not
the easiest path to an industrialized means of production.
So you added parameters to your global mbox and now you want to remove them? How did you add them: via the DTM UI or with your own custom code? I would just remove the changes you made to add them in the first place. You don't need to create another mbox.