
In addition to designing, I own the end-to-end CRO testing process at CubeSmart, ensuring every design decision is validated by data. I combine rigorous statistical analysis of core KPIs (e.g., conversion lift) with qualitative behavioral insights, so that winning tests not only increase conversions but also genuinely improve the user experience.

The Bold Filter & Sort Test

Company: CubeSmart

1. Filter & Sort-ing Out The Details

Initial analysis of mobile customer behavior revealed a significant pattern: mobile users who engaged with the Filter & Sort button had a substantially higher conversion rate for self-storage rentals compared to those who did not.

Key Data Insights (Mobile ONLY):

We observed a significant uplift in successful rentals (conversions) for engaged users across the two key areas where the Filter & Sort button was available (City and Facility pages).

2. Hypothesis & Testing Method

Although engaged users converted at a much higher rate, overall engagement with the button itself was low. Based on this low engagement data, we hypothesized that the primary barrier was low discoverability.

Hypothesis: By increasing the visual contrast and prominence of the Filter & Sort button with a bold color treatment, we will significantly increase user interaction with the feature, thereby leading to a measurable uplift in successful storage rental conversions.

Experiment Setup

  • Test Type: A/B test

  • Audience: 50% of mobile users on a City or Facility page.

  • Confidence level: 95%

  • Primary KPI: successful Filter & Sort interaction rate. 

  • Secondary KPI: storage rental conversion rate.

  • Duration & Tools: 3 weeks, run in Monetate (duration based on pre-test traffic analysis to reach statistical significance; see the sketch below).
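For context, the duration estimate comes from a standard pre-test power calculation: given a baseline rate and the smallest lift worth detecting, how many sessions per arm does the test need at 95% confidence? The sketch below illustrates that math; the baseline rate, target lift, and traffic figures are placeholder assumptions, not the actual CubeSmart numbers.

```typescript
// Sessions needed per arm to detect a relative lift in a conversion-style rate
// at 95% confidence (two-sided) and 80% power, using the standard
// two-proportion sample-size formula. All inputs are placeholder values.
const Z_ALPHA = 1.96; // z-score for 95% confidence, two-sided
const Z_BETA = 0.84;  // z-score for 80% power

function sampleSizePerArm(baselineRate: number, relativeLift: number): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;
  const numerator =
    Z_ALPHA * Math.sqrt(2 * pBar * (1 - pBar)) +
    Z_BETA * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil(numerator ** 2 / (p2 - p1) ** 2);
}

// Hypothetical inputs: 3% baseline interaction rate, 15% relative lift,
// 20,000 eligible mobile sessions per week split 50/50 between arms.
const perArm = sampleSizePerArm(0.03, 0.15);
const weeksNeeded = Math.ceil(perArm / (20000 / 2));
console.log(`~${perArm} sessions per arm ≈ ${weeksNeeded} weeks of traffic`);
```

With these placeholder inputs the calculation lands at roughly three weeks of traffic, which is the kind of analysis that set the real test window.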

3. Variant Design

The core goal was to make the feature visually compete for attention. I explored several color options and ultimately chose CubeSmart's signature brand red for the button's background.

  • Visual Hierarchy: Red is already the color of the primary "Reserve Now" CTAs, so giving the Filter & Sort button the same treatment let it break through the existing visual noise.

  • Accessibility: The chosen red/white color combination satisfied WCAG AA contrast standards, ensuring a usable design for all users (see the contrast-check sketch below).
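For reference, WCAG AA requires a contrast ratio of at least 4.5:1 for normal-size text. The sketch below shows how that ratio is computed from two sRGB colors; the hex values are placeholders rather than CubeSmart's exact brand red.

```typescript
// WCAG 2.x contrast ratio between two sRGB colors; AA requires >= 4.5:1
// for normal text. The hex values used below are placeholders.
function channelToLinear(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const r = channelToLinear((n >> 16) & 0xff);
  const g = channelToLinear((n >> 8) & 0xff);
  const b = channelToLinear(n & 0xff);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: string, bg: string): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [hi, lo] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

// White label text on a placeholder brand red
const ratio = contrastRatio("#ffffff", "#cc0000");
console.log(`${ratio.toFixed(2)}:1 — passes AA (>= 4.5:1): ${ratio >= 4.5}`);
```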

Control

Test Variant

Implementation

The Variant was implemented using the Monetate platform to override the existing site CSS, applying a new style to the button that included a red background, white text, and white icons.
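The exact Monetate action isn't reproduced here, but its net effect was equivalent to layering an override stylesheet on top of the existing button styles. Below is a minimal sketch of that idea; the `.filter-sort-btn` selector and the hex value are hypothetical placeholders, not the production code.

```typescript
// Inject an override stylesheet that restyles the Filter & Sort button.
// The selector and hex value below are hypothetical placeholders.
function applyFilterSortOverride(): void {
  const style = document.createElement("style");
  style.textContent = `
    .filter-sort-btn {
      background-color: #cc0000; /* brand red (placeholder hex) */
      color: #ffffff;            /* white label text */
    }
    .filter-sort-btn svg {
      fill: #ffffff;             /* white icons */
    }
  `;
  document.head.appendChild(style);
}

applyFilterSortOverride();
```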

4. Winning Results!

After the 3-week test period, the Red button variant achieved statistical significance and proved to be a HUGE success across both primary and secondary metrics! When comparing conversion rates of users who interacted with the F&S button, the experiment group saw:

79.8%

lift in conversions

(from the Facility page)
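Significance itself was evaluated in the testing platform's reporting, but the underlying comparison is a standard two-proportion z-test, sketched below. The visitor and conversion counts are illustrative placeholders, not the actual test data.

```typescript
// Two-proportion z-test: is the variant's conversion rate significantly
// higher than the control's? All counts below are illustrative placeholders.
function twoProportionZ(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se;
}

// Placeholder example: control vs. variant among users who opened Filter & Sort
const z = twoProportionZ(120, 2400, 216, 2400);
const lift = (216 / 2400 - 120 / 2400) / (120 / 2400);

console.log(`z = ${z.toFixed(2)} (|z| > 1.96 ⇒ significant at 95%)`);
console.log(`relative lift = ${(lift * 100).toFixed(1)}%`);
```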

Key Learning and Next Steps

The results confirmed our hypothesis: the low-contrast Filter & Sort button was a major barrier to conversion. A bold, highly visible design successfully captured user attention, streamlining the storage unit selection process and improving user experience.

 

The red button design was rolled out as a permanent update across the entire site for mobile users.

Map Swap Test

Company: CubeSmart

1. Switching The Details

Initial analysis of desktop customer behavior revealed an interesting observation: desktop users who engaged with the inventory drop-down lists on our City Pages (where users can see a list of storage facilities in the area) had a substantially higher conversion rate for self-storage rentals compared to those who did not.

Key Data Insights (Desktop ONLY):

2. Hypothesis & Testing Method

Based on the lower conversion data for users who did not use the drop-downs, we hypothesized that the primary barrier was the facility map, which sat as the first element on the left side of the page. The map drew moderate engagement; however, that engagement did not translate into conversions the way the "See Units" drop-downs did.

Hypothesis: By swapping the placement of the facility list and storage map, we will significantly increase user interaction with the unit list, which will lead to an uplift in successful storage rental conversions.  

Experiment Setup

  • Test Type: A/B test

  • Audience: 50% of desktop users on a City page.

  • Confidence level: 95%

  • Primary KPI: Drop-down interaction rate. 

  • Secondary KPI: storage rental conversion rate.

  • Duration & Tools: 6 weeks, run in Monetate (duration based on pre-test traffic analysis to reach statistical significance).

3. Variant Design

Since the hypothesis called for a straightforward swap of the map and the facility list, there was no room for design interpretation.

Control

Test Variant

Implementation

We originally attempted to build the test variant using the simple swap-element feature in the Monetate platform, but we discovered that a rudimentary swap broke the site, so we pivoted to manipulating the site's CSS directly. After some careful coding and thorough QA, I was able to get the desired effect with all functionality intact (a simplified sketch of this approach follows below).
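The production CSS isn't reproduced here, but the sketch below shows one common way to swap two sibling elements purely with styling (flexbox `order`), which leaves the underlying markup and its JavaScript bindings untouched. The selectors are hypothetical placeholders.

```typescript
// Visually swap the facility list and the map by reordering them with
// flexbox, leaving the underlying markup (and its JS bindings) untouched.
// The selectors below are hypothetical placeholders.
function applyMapSwapOverride(): void {
  const style = document.createElement("style");
  style.textContent = `
    .city-page-results {        /* shared parent of map + list */
      display: flex;
    }
    .city-page-results .facility-list { order: 1; } /* list moves to the left */
    .city-page-results .facility-map  { order: 2; } /* map moves to the right */
  `;
  document.head.appendChild(style);
}

applyMapSwapOverride();
```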

4. Winning Results!

After the 6-week test period, the test variant achieved statistical significance, proving to be the better experience for the customer! When comparing our primary and secondary metrics between the control and test variant groups, we saw a lift in both click interactions and conversions:

4.58%

lift in clicks!

8.96%

lift in conversions after clicks!

Key Learning and Next Steps

The results confirmed our hypothesis: swapping the placement of the map and facility list was a major improvement to desktop users' ability to interact with the unit list on City pages. Having the list as the first page element on the left led to more engagement and higher rental conversions.

 

Swapping the map and facility cards was rolled out as a permanent update across the entire site for all desktop users.

