TrustRadius
Useful for simple changes, but developers needed for complex tests
https://www.trustradius.com/ab-testing
March 01, 2019

Score 7 out of 10
Vetted Review
Verified User

Overall Satisfaction with VWO

VWO is used by the UX & CRO department of our digital marketing agency to facilitate running A/B tests on client websites. It solves the business problem of needing too much developer resource to run tests, allowing our team members to mostly set up tests themselves rather than writing complex code.
  • Quick to set up A/B tests on sites, with multiple variations allowed and the ability to use segments to control who sees the test.
  • Easily links with GA for accurate analysis.
  • Heatmaps are included, which allow you to see differences between test variations.
  • Setting up multivariate tests can be very difficult, particularly if you want to test something which requires custom code rather than using the VWO editor.
  • Using the VWO editor to create tests can generate excessive amounts of code for quite simple changes, compared with custom code written by a developer.
  • The heatmaps could be more detailed, and seeing how far users had scrolled would be useful.
  • Integration with more tools, such as Hotjar, would be very handy.
  • The ability to offer a CRO service is greatly enhanced by VWO, as it allows us to run tests for clients.
  • The editor means that, to some extent, we don't need much developer resource to run tests, lowering the cost of doing so.
A/B testing has benefited my organisation, as running A/B tests for our clients is the basis of our CRO offering. However, testing hundreds of changes simultaneously through multivariate testing has not particularly benefited my organisation, due to the difficulty we found in setting up multivariate tests. Those tests took a disproportionate amount of time to set up and needed a lot of developer resource, so they cost a lot. Tracking conversion goals via VWO can be tricky, as the goals are fiddly to set up, so we prefer to use GA to track the results of a test; we also find GA shows more accurate results. Likewise, because we don't use VWO to track conversion goals, we use third-party tools to understand the statistical validity of results.
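For illustration, the pattern we mean looks roughly like this: push the chosen variation into GA as an event and build reports on it there. This is a sketch, not VWO's actual integration code; gtag is the standard GA snippet function, while getActiveVariation is a hypothetical stand-in for however your test code exposes the variation it chose.

```typescript
// Minimal sketch: report the active test variation to GA as an event so
// results can be segmented and analysed in GA rather than via VWO's goals.

declare function gtag(...args: unknown[]): void; // provided by the GA snippet

interface ActiveVariation {
  campaign: string;   // e.g. "homepage-hero-test"
  variation: string;  // e.g. "variation-2"
}

// Hypothetical helper: however your test code exposes the chosen variation.
declare function getActiveVariation(): ActiveVariation | null;

const active = getActiveVariation();
if (active) {
  // One event per pageview; segment on these parameters in GA reports.
  gtag("event", "ab_test_impression", {
    experiment_name: active.campaign,
    variant_name: active.variation,
  });
}
```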
It's handy being able to set up custom segments to set tests live to specific IP addresses so clients can view them. We've also used them to exclude specific browsers or target certain devices. However, we once had difficulty targeting paid search traffic using the pre-made segment: it only captured traffic whose medium was tagged as paid search, and didn't account for Google Ads auto-tagging paid traffic with gclid (or Microsoft Advertising's msclkid). As a result, we ended up with no visitors in the test at all.
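A rough sketch of the condition we actually needed, assuming a custom segment can evaluate the landing URL (the utm_medium values checked here are illustrative):

```typescript
// Count a visit as paid search if the medium is tagged as paid, OR if an
// auto-tagging click ID (Google Ads' gclid, Microsoft Advertising's
// msclkid) is present in the landing URL.

function isPaidSearchVisit(url: string): boolean {
  const params = new URL(url).searchParams;
  const medium = (params.get("utm_medium") ?? "").toLowerCase();
  const taggedAsPaid = ["cpc", "ppc", "paidsearch", "paid"].includes(medium);
  const hasClickId = params.has("gclid") || params.has("msclkid");
  return taggedAsPaid || hasClickId;
}

// An auto-tagged Google Ads click often carries no utm_medium at all:
console.log(isPaidSearchVisit("https://example.com/?gclid=abc123"));    // true
console.log(isPaidSearchVisit("https://example.com/?utm_medium=cpc"));  // true
console.log(isPaidSearchVisit("https://example.com/"));                 // false
```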
It's enabled our team to create simple tests themselves, reducing the need for developer resource. However, it spits out a tonne of code for even the simplest of changes, which can cause issues for the site, so we tend to ignore the editor most of the time and use custom code written by developers instead. The editor is still useful for identifying the class of an element, or for an initial play around with layout to see how something looks when changed. But for actually running tests, we use custom code.
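To give a sense of scale, the hand-written alternative is usually a single targeted DOM change along these lines (the ".hero-cta" selector and new copy are illustrative, not from a real client site):

```typescript
// Minimal sketch of hand-written variation code: one targeted DOM change
// instead of the editor's generated output.

function applyVariation(): void {
  const cta = document.querySelector<HTMLElement>(".hero-cta");
  if (!cta) return; // element not present on this page; change nothing
  cta.textContent = "Start your free trial";
  cta.style.backgroundColor = "#2e7d32";
}

// Variation code can run before the DOM has finished parsing, so wait if needed.
if (document.readyState === "loading") {
  document.addEventListener("DOMContentLoaded", applyVariation);
} else {
  applyVariation();
}
```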
VWO is useful when you need to run A/B tests on websites which have been set up well. Problems can arise when the tests are more complicated, especially multivariate tests, or when the website is quite old or doesn't function as well as it should; then tests can become quite buggy. Previewing tests via the preview tool can also be difficult, as it sometimes doesn't display the test correctly. It is also difficult to let clients preview live tests when there is more than one variation: the test can be set live to the client's IP, but to guarantee they see a specific variation, that variation's traffic allocation has to be set to 100%, so you can only show one variation at a time.