Quick and dirty landing page experiments
At Educatly, we noticed that our conversion rates when directing ad traffic to programs on our platform were too low. That made us realize we had a problem with how information was structured on our program pages.
My Role: Product Manager and Developer
With the help of a designer and a marketer
What was the problem?
Educatly is a platform that helps young people find the study abroad opportunity of their dreams. At the time, Educatly wanted to scale new user acquisition by running ads. We noticed that our conversion rates when directing traffic to programs on our platform were too low. That's how we realized we had a problem with how information was structured on our program pages.
We didn't want to spend weeks or months producing high-fidelity designs and implementing them on our platform; we wanted to test quickly.
The outcome was a set of landing pages, all created for the same study program. They follow the same structure, but each landing page contains one change to support its respective experiment. Some of the experiments were:
- Will highlighting the application deadline have an impact on the bounce rate, time spent or conversion rate on this page?
- Is information about tuition and scholarships important in the decision making for students?
- How will university rankings affect the behavior of the user on this page?
From each experiment we gained valuable insights that we were then able to apply to the final landing page design that we are implementing on our platform.
We went through the program pages of some of our competitor platforms and of the universities themselves and made a list of all the features and pieces of information we could find. We assigned each feature a score based on where it appeared on the page. We cross-checked this with the insights we had gathered from customer interviews and surveys. With this, we identified a handful of high-priority pieces of information around which we wanted to run experiments.
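The scoring exercise above can be sketched roughly as follows. This is a hypothetical illustration, not our actual spreadsheet: the position weights, feature names, and counts are made up, and the real cross-check with interviews was more qualitative than adding points.

```python
# Hypothetical position weights: features placed higher on a
# competitor's page earn more points (illustrative values).
POSITION_WEIGHTS = {"above_fold": 3, "mid_page": 2, "below_fold": 1}

def score_features(sightings, interview_mentions):
    """Sum position-based points across competitor pages, then add
    one point per mention in customer interviews/surveys."""
    scores = {}
    for feature, position in sightings:
        scores[feature] = scores.get(feature, 0) + POSITION_WEIGHTS[position]
    for feature, mentions in interview_mentions.items():
        scores[feature] = scores.get(feature, 0) + mentions
    # Highest-scoring features become experiment candidates.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: the deadline sits above the fold on two competitor pages,
# tuition info mid-page on one, rankings below the fold on one.
sightings = [
    ("application_deadline", "above_fold"),
    ("application_deadline", "above_fold"),
    ("tuition_and_scholarships", "mid_page"),
    ("university_ranking", "below_fold"),
]
mentions = {"tuition_and_scholarships": 5, "university_ranking": 1}
ranking = score_features(sightings, mentions)
# → tuition_and_scholarships (7), application_deadline (6), university_ranking (2)
```

Sorting by the combined score gives a rough priority list of which information point to test first.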
Once we had built the main landing page, we launched it with a low ad budget to get our benchmark numbers. Based on these numbers, we decided on the first experiment to run. Within a few days, the designer would provide the updated user experience. Then I would duplicate the landing page, implement the changes, and hand it over to the marketer. The marketer would run Facebook ads against the new landing page for a couple of days, and then we would regroup to analyze the results. Based on the numbers, we would decide whether the experiment was successfully concluded, whether we wanted to re-run it with changes, or whether it had failed. Then we decided which experiment to run next, and kept repeating the process.
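Deciding whether a variant genuinely beat the benchmark can be sketched with a standard two-proportion z-test. The numbers below are invented for illustration; with a low ad budget the real sample sizes were small, so significance results like this should be read with caution.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a/n_a: conversions and visitors on the benchmark page.
    conv_b/n_b: conversions and visitors on the experiment variant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (using erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers: 40/1000 conversions on the benchmark page
# vs. 65/1000 on the variant with the highlighted deadline.
z, p = two_proportion_ztest(40, 1000, 65, 1000)
significant = p < 0.05
```

If `significant` is true, the change is worth keeping; otherwise we would either re-run the experiment with tweaks or call it a failure, as described above.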
Each design focused on one minimal change without affecting the rest of the landing page. This allowed us to measure the effect of an isolated change, and to implement it quickly.
My main learning was that despite generating tons of insights about our users, and even generating leads, we were not able to easily transfer those learnings to our actual platform. We had not included our design agency in the process, so when it came to implementation, it was hard for them to make product decisions independently.
We picked a program for this experiment for which we had high-quality information available. On educatly.com, we have thousands of programs, and not all of them have the same amount of information, which made it hard to build designs that accounted for this variance while maintaining our benchmark conversion rate.