A/B Testing | General Experiments Setup Instructions

📝🔍 Discover how to use Rebuy's native A/B testing feature to experiment with custom CSS and JS on your storefront! 🛍️✨

Christian Sokolowski avatar
Written by Christian Sokolowski
Updated over a week ago


Rebuy's "General" A/B Testing tool supports conversion rate optimization (CRO) by enabling experimentation with on-site customizations. Unlike widget experiments, General A/B tests target specific elements and behaviors on the storefront rather than a single widget. For instance, you can compare a Smart Cart against a native cart to determine which performs better, or evaluate the optimal placement for images on the site. 🛒🔍✨



General A/B Testing enhances Rebuy’s A/B Testing suite by allowing Merchants to utilize JS and/or CSS to conduct experiments. To create an experiment, follow the steps outlined below.

Step 1: Navigate to "A/B Testing" from the main left-hand Rebuy menu.

Step 2: Click the “Create New Experiment” button.

Step 3: In the "New A/B Test" modal that opens, enter a name for your experiment in the "Name" box to help you keep track of it. Choose "General" as the "Experiment Type" and click "Add Experiment" to proceed to the experiment editor.


Step 4: After completing the initial steps, users will be directed to the Experiment settings page. The first crucial setting to consider is Page Targeting, which allows users to choose the specific page or pages where they want the experiment to be active.

Once the desired pages have been selected, users can proceed to create their control and variations. It's possible to have one control version and run up to 9 variations simultaneously, providing a comprehensive testing environment. 📊✨

To change the Control or a Variant, click the Edit button next to it. This opens a drawer where you can enter the JS or CSS changes you would like to test. Once JavaScript or CSS has been entered, the change is indicated on the main Experiment page.

In this example, I configure my control to display the Smart Cart and add a console log in the JS section for tracking. For my single variation, I hide the Smart Cart with CSS in the CSS section and add a matching console log in the JS section.
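A setup like the one described above can be sketched as follows. This is an illustration only: the `.rebuy-cart` selector is an assumption, so inspect your storefront to find the actual Smart Cart element before hiding it.

```javascript
// CSS section of the variation (hides the Smart Cart).
// NOTE: ".rebuy-cart" is a hypothetical selector — check your theme.
const variationCSS = '.rebuy-cart { display: none !important; }';

// JS section of the control and the variation: a console log that makes
// it easy to confirm which side of the experiment a visitor is seeing.
function trackingLog(name) {
  const message = `[Rebuy A/B] ${name} is active`;
  console.log(message);
  return message;
}

trackingLog('Control');   // Smart Cart left visible
trackingLog('Variation'); // Smart Cart hidden by the CSS above
```

In practice, you would paste the raw CSS rule into the CSS box and the `console.log` line into the JS box of each variant's drawer, rather than the JavaScript wrapper shown here.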


Step 5: Select the Experiment Goal (Revenue or Conversion Rate). Scroll down to the "Experiment Goal" section. The selected goal determines the criteria used to decide the winner of your experiment.

The default option is "Revenue": the control or variation that generates the most revenue wins. The "Conversion Rate" option declares the winner based on which variation (or the control) leads to the highest percentage of order conversions.
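To make the difference between the two goals concrete, here is an illustrative sketch with made-up numbers (Rebuy computes winners internally; this is not its actual algorithm):

```javascript
// Hypothetical results for one experiment.
const results = [
  { name: 'Control',   sessions: 1000, orders: 40, revenue: 3200 },
  { name: 'Variation', sessions: 1000, orders: 52, revenue: 2990 },
];

// "Revenue" goal: the highest total revenue wins.
const byRevenue = [...results].sort((a, b) => b.revenue - a.revenue)[0].name;

// "Conversion Rate" goal: the highest orders-per-session ratio wins.
const byConversion = [...results].sort(
  (a, b) => b.orders / b.sessions - a.orders / a.sessions
)[0].name;

console.log(byRevenue);    // "Control" — more total revenue
console.log(byConversion); // "Variation" — more orders per session
```

Note that the two goals can disagree, as in this example, so choose the one that matches what you actually want to optimize.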


Step 6: (Optional) To set a specific test duration, navigate to the "Test Duration" section. If you prefer manual control over the experiment's start and end, there's no need to set a specific schedule. However, if you want an automated start and end, utilize the "Start Date" and "Start Time" selectors to initiate your experiment in the future. Similarly, use the "End Date" and "End Time" selectors to specify the desired end time (which must be after the scheduled start).

Remember, you can add or modify the scheduled start or end at any time before the experiment begins or concludes, providing flexibility in your testing approach. ⏰✨

Your date/time is based on the default time zone configured for your Rebuy account. You can verify your time zone by going to Account > Settings > Time Zone Preference.


Step 7: Once these steps have been finished and saved, the user is all set to commence their A/B Test. Simply click the “Start Test” button to begin. Unlike widget experiments, general A/B testing does not require any placeholder IDs, streamlining the testing process for a seamless experience. 🚀✨


If you have set a specific end time for your experiment, it will conclude based on your pre-established settings. Through the vertical ellipsis menu, you have the option to promptly end an experiment or edit it to schedule an end time if needed. Once your experiment is completed, you can access the results in the "Completed" tab on the A/B testing dashboard.

Unlike widget experiments, when a general A/B test is complete, all variations configured for the test are removed from the storefront.


Once the experiment is underway, you can locate it in the "Active" table on the "A/B Testing" page under "Manage Your Experiments." Performance metrics are readily available in the summary blocks of experiments, accessible under both the "Active" and "Completed" tabs.
