A/B Testing Dashboards
The A/B Testing experiments dashboard outlines the performance of your control widget alongside each of the variations you chose to test against it. If you have not yet set up your experiment, head over to the install instructions to get started!
The metrics you see on this dashboard are at the page level, NOT the widget level. If you are interested in your widget-specific performance, head over to your reports page to view the classic reporting.
Page-level reporting is used for A/B testing on ecommerce sites because it captures the overall impact of each page variation. While widget-level reporting can provide insight into specific elements or features, page-level reporting shows how each variation affects the overall user experience and conversion rate. That makes it better suited to optimization and decision-making at the page level, leading to improved performance and better user satisfaction.
Variation Names
These names are the aliases you chose during the installation process. If you did not set a specific alias for a widget or the control, the name defaults to the widget name.
Traffic
This is the traffic you elected to allocate to each variation type. Visitors are assigned to variations at random, weighted by the percentages you chose, similar to flipping a coin. As a result, the actual traffic distribution among the variations may not exactly match your configured split. Randomized assignment keeps widget loading fast: each visitor can be bucketed immediately on page load, so the widgets load quickly and efficiently, providing a seamless user experience.
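Conceptually, the assignment works like a weighted coin flip on page load. Here is a minimal sketch of that idea (the names and structure are illustrative only, not Rebuy's actual implementation):

```typescript
// Hypothetical sketch of weighted random variation assignment.
// Shapes and names are illustrative, not Rebuy's actual code.
interface Variation {
  name: string;
  traffic: number; // allocated share of traffic, e.g. 0.5 for 50%
}

function assignVariation(variations: Variation[]): Variation {
  // Draw one random number in [0, 1) and walk the cumulative weights.
  const roll = Math.random();
  let cumulative = 0;
  for (const variation of variations) {
    cumulative += variation.traffic;
    if (roll < cumulative) return variation;
  }
  // Fallback in case the weights don't sum to exactly 1.
  return variations[variations.length - 1];
}

// Example: a 50/25/25 split between the control and two variants.
const chosen = assignVariation([
  { name: "Control", traffic: 0.5 },
  { name: "Variant A", traffic: 0.25 },
  { name: "Variant B", traffic: 0.25 },
]);
```

Because each visitor is bucketed independently, small samples can drift from the configured percentages; the split converges toward your chosen allocation as traffic grows.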
Visitors
The total number of visitors (views) for each variation is tracked using a cookie set on page load. Once a visitor sees a particular variation, they will consistently see the same variation throughout their browsing session, unless they clear their cookies or open the site in a private browsing window. This cookie-based tracking ensures visitors have a consistent experience and allows the performance and impact of each variation to be measured accurately against the views it received.
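Building on the hypothetical assignVariation helper sketched above, the stickiness can be pictured as storing the assigned variation in a cookie and reusing it on later page loads. A minimal sketch, assuming an illustrative cookie name of "ab_variation":

```typescript
// Hypothetical sketch of cookie-based variation stickiness.
// Reuses the Variation type and assignVariation from the sketch above;
// the cookie name "ab_variation" is illustrative only.
function getCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function getOrAssignVariation(variations: Variation[]): string {
  // Reuse the earlier assignment if the cookie is already set.
  const existing = getCookie("ab_variation");
  if (existing) return existing;

  // Otherwise assign once and persist the choice for future page loads.
  const chosen = assignVariation(variations).name;
  const thirtyDays = 60 * 60 * 24 * 30;
  document.cookie = `ab_variation=${encodeURIComponent(chosen)}; path=/; max-age=${thirtyDays}`;
  return chosen;
}
```

This is why clearing cookies or opening a private window can place the same person in a different variation: the stored assignment is gone, so a fresh coin flip happens.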
Conversion
The "Conversion Rate" option is used to determine the winning variation in an A/B test based on which variation, including the control, achieves the highest percentage of order conversions. In other words, it assesses the success of each variation by comparing the conversion rates, which represent the proportion of visitors who complete a desired action.
The conversion rate is based on page views of the widgets or control, not individual widget conversions. If you want to see widget conversion rates, visit the performance reporting section. The "winning" widget is chosen based on the page-load conversion rate shown on the A/B testing dashboard, not on which specific widget drove the most conversions during the A/B test.
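As a concrete illustration, a conversion-rate winner could be computed like this (the stats shape and numbers are hypothetical, not Rebuy's API):

```typescript
// Hypothetical sketch of picking a winner by conversion rate.
interface VariationStats {
  name: string;
  visitors: number;    // page views tracked for this variation
  conversions: number; // orders attributed to those page views
}

function pickWinnerByConversion(stats: VariationStats[]): VariationStats {
  // Conversion rate = conversions / visitors; the highest rate wins.
  return stats.reduce((best, current) =>
    current.conversions / current.visitors > best.conversions / best.visitors
      ? current
      : best
  );
}

// Example: the variant converts at 4% vs. the control's 3%.
const winner = pickWinnerByConversion([
  { name: "Control", visitors: 10_000, conversions: 300 },
  { name: "Variant A", visitors: 10_000, conversions: 400 },
]);
console.log(winner.name); // "Variant A"
```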
Revenue
The "Revenue" option is used to determine the winning variation based on Rebuy-attributed revenue per visitor. Rather than comparing raw total revenue β which can be skewed by uneven traffic splits β the system calculates revenue per visitor for each variant and selects the winner based on which variant generates the most revenue on a per-visitor basis.
Revenue displayed represents Rebuy-attributed revenue β the revenue from items added through Rebuy-powered recommendations during the experiment period. This is a subset of total order revenue and gives you a more precise view of how each variant specifically impacts Rebuy-driven sales.
Per User
The "Per User" metric shows Rebuy-attributed revenue divided by the number of visitors who viewed each variation. This is the primary metric used to determine the winner when Revenue is selected as the experiment goal, as it accounts for differences in traffic volume between variants.