A/B optimization is a useful concept you may already be familiar with. The idea is to figure out which version of a landing page, website, etc. is best ("optimal") for achieving a desired outcome. Generally speaking, this outcome is measurable as a conversion. For example, you may have two landing pages that differ in one detail: one has a red checkout button and one has a green one. You want to figure out which is better at generating conversions by testing against real live visitors, rather than relying on gut instinct. To do so, you'd set up an A/B test as follows:
1. Set up a conversion tracking link pointing to the first landing page
2. Set up another conversion tracking link pointing to the second landing page
3. Set up a traffic splitting ruleset to divide the traffic between these two conversion traffic links equally (50-50).
The traffic splitting ruleset should be chained together with the conversion tracking links. In other words, one 50% probability rule in the traffic splitting ruleset should have the first Addue conversion tracking link (pointing to the first landing page) as its destination URL, and the other 50% probability rule should have the second Addue conversion tracking link (pointing to the second landing page) as its destination URL.
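Under the hood, a traffic splitting rule like this amounts to a weighted random choice between destination URLs. Here's a minimal sketch of that idea in Python; the URLs are hypothetical placeholders, not real Addue links, and this is an illustration of the concept rather than Addue's actual implementation:

```python
import random

# Hypothetical conversion tracking links -- substitute your real Addue links.
SPLIT_RULES = [
    (0.5, "https://track.example.com/link-a"),  # rule 1 -> landing page A (red button)
    (0.5, "https://track.example.com/link-b"),  # rule 2 -> landing page B (green button)
]

def route_visitor(rules):
    """Pick a destination URL at random, weighted by each rule's probability."""
    destinations = [url for _, url in rules]
    weights = [probability for probability, _ in rules]
    return random.choices(destinations, weights=weights, k=1)[0]
```

With equal 50% weights, a large number of visitors will be divided roughly evenly between the two tracking links, which is exactly what the 50-50 ruleset achieves.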
If you've gone through the prerequisite tutorials for this tutorial, you should be comfortable enough to set up the above scenario with ease. Don't forget to place the tracking pixels associated with each conversion tracking link on the landing pages, so you can accurately track any conversions they're driving.
With that all set up, you're ready to go. Simply point your incoming traffic at the traffic splitting link and Addue will take care of routing things correctly. You'll see any conversions in your conversion activity report in the usual way. Since Addue reports the referring page and the rule ID applied, it's easy to see which of the two landing pages is generating which conversions and to compare their performance. Downloading the report to Excel may also help, since for the A/B test to be statistically valid you'll need to run it over a decent volume of traffic: probably a hundred conversions or more if possible.
You can of course test many landing pages at once if you wish, since the traffic splitting rule can split traffic amongst as many destination URLs as you want. If you wanted to test four possible checkout button colors instead of just two, for example, you could set up a traffic splitting rule feeding four conversion tracking links with equal probability (25% each), with each of those conversion tracking links pointing to one of the four landing pages.
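The four-way split works the same way as the two-way one, just with more rules. A quick sketch, again using made-up placeholder links rather than real Addue URLs, simulates routing a batch of visitors across four equally weighted links and tallies where they land:

```python
import random
from collections import Counter

# Four hypothetical tracking links, one per checkout button colour tested.
LINKS = {
    "red": "https://track.example.com/red",
    "green": "https://track.example.com/green",
    "blue": "https://track.example.com/blue",
    "orange": "https://track.example.com/orange",
}

def simulate_split(n_visitors):
    """Route n_visitors across the links with equal (25%) probability each."""
    # random.choices with no weights treats every option as equally likely.
    chosen = random.choices(list(LINKS), k=n_visitors)
    return Counter(chosen)
```

Each colour should receive roughly a quarter of the visitors, mirroring the 25%-per-rule setup described above.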
Books have been written on A/B optimization and testing, and there's a lot more to learn about the theory behind it, including exactly how much traffic and how many conversions you need for conclusions to be valid at a required degree of confidence. Common sense, however, can take you far. The more traffic and conversions you observe, the more confident you can be when making a judgement about which website or landing page is best. Making judgements based on just a few observations is likely to lead to the wrong conclusions.
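If you want to go a step beyond common sense, a standard way to check whether an observed difference in conversion rates is meaningful is a two-proportion z-test. This sketch uses only the Python standard library and illustrative numbers, not output from any real report:

```python
import math

def z_statistic(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test statistic for a difference in conversion rates."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate under the assumption the pages perform the same.
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    standard_error = math.sqrt(
        pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)
    )
    return (rate_a - rate_b) / standard_error

# |z| greater than about 1.96 corresponds to roughly 95% confidence
# (two-sided) that the difference is real rather than random noise.
```

For example, 120 conversions from 1,000 visitors versus 90 from 1,000 gives a z around 2.2, clearing the 1.96 bar, whereas the same rates observed over only 100 visitors each would not. This is why the tutorial suggests running the test until you have a decent volume of conversions.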
Another important point: if you make several changes between landing pages, you won't be able to pinpoint exactly which of the changes is driving the better performance of one page versus the other. Changing one variable at a time is always best.