July 30, 2014
Why we use Maxymiser at our business
Score 9 out of 10
Overall Satisfaction with Maxymiser
Maxymiser is used in conjunction with online analytics tools to help us improve the conversion rate on our website. It is used in all markets across the group, with very complex tests in more advanced markets and simpler tests in less advanced ones. In some markets multiple departments use it, and in others it is used by the online department only. It helps us understand the risks of changes to the website, improve user experience, and deliver better ROI on our online activities.
- Very knowledgeable account management and consulting team
- Easy to implement and roll out across the footprint.
- Fully managed service is great if you are just starting with testing
- Comprehensive and forward thinking product roadmap
- Visual Campaign Builder (VCB) still needs work
- The test set up process is great for complex tests but a bit too long for simpler ones
Optimizely, Visual Website Optimizer, Adobe Test and Target, Webtrends, Monetate, Autonomy
Maxymiser is a premium tool with a premium team behind it, so if budget is an issue, it may be better to look at other suppliers to get started. Here are some of the selection criteria we looked at:
- Quality of their suggestions for the first test: did they understand the business and come up with the best KPIs for the test and your company's strategy? Was the design tailored to your style guidelines? Were follow-up test recommendations provided?
- Resource allocated to the test and the account: how quickly are the briefs produced? How many people work on the account? Where are the assets hosted? How quickly are issues rectified? How much time passed between the brief and the test launch?
- Tag load time: how quickly does the tag load in different geographies? Does it cause any problems on any of the pages? Does it work with your tag management solution?
- Analytics: does it provide real-time analytics? Does it integrate with other data tools used on your site? How easy is it to understand the data? How easy is it to export and work with the data? How many different ways are there to analyse the data (are the filters and segments provided enough for your business)? Were post-test reports delivered on time? What was the quality of the reports?
- Interface: how easy is it to use and understand? What is the quality of the training provided, how often is it given, and is it at the level the team needs? How easy is it to stop, pause, and launch a test? How often is the interface updated? (This is especially relevant if you are not on the fully managed service.)
- Roadmap: is it ahead of the competition in terms of product functionality? How relevant is the upcoming functionality to your business? Is it in line with industry trends?
- Team (for fully managed service only): how approachable and knowledgeable is the team? How quickly do they respond to requests? Do they understand the challenges in your specific market? How easy is it to work with them? What is the quality of their communication and engagement with you?
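For the tag load time criterion above, a rough baseline is easy to collect yourself: time repeated downloads of the vendor's tag script from the machines or regions you care about. A minimal Python sketch, using a placeholder URL rather than any vendor's actual tag endpoint:

```python
# Hypothetical sketch for benchmarking a vendor tag's download time.
# TAG_URL is a placeholder, not Maxymiser's (or any vendor's) real endpoint.
import time
import urllib.request


def time_tag_load(url: str, attempts: int = 3) -> float:
    """Download the tag script several times and return the average
    elapsed time in seconds."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()  # force the full body to be transferred
        total += time.perf_counter() - start
    return total / attempts


if __name__ == "__main__":
    TAG_URL = "https://example.com/vendor-tag.js"  # placeholder
    print(f"average load time: {time_tag_load(TAG_URL):.3f}s")
```

Run from a machine in each geography of interest, this gives a comparable number per region; for the full in-page picture (blocking behaviour, interaction with a tag manager) the browser's own resource timing tools are the better instrument.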