OpenText Optimost is designed to help companies deliver engaging, profitable websites and campaigns and includes self-service capabilities. Optimost also provides white-glove consulting to help companies test confidently when the stakes and complexity are highest, act immediately when speed is of the essence, and match the perfect content to every customer.
Optimizely Web Experimentation
Score 8.6 out of 10
Whether launching a first test or scaling a sophisticated experimentation program, Optimizely Web Experimentation aims to deliver the insights needed to craft high-performing digital experiences that drive engagement, increase conversions, and accelerate growth.
The ease of implementation combined with the managed services results in a tool that virtually anyone can use: implementation is less than 10 lines of code added to the relevant pages of the website (we simply added it to our master page template to have it available on any page), and from there the customer can be as involved or uninvolved as they wish. At BSI we are very hands-on with the testing programme, usually developing and designing the tests ourselves and having HP build them, but if we wanted HP to develop, design and build, and limit our role to QA and review, that is an option.
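To give a sense of how light that kind of installation can be, here is a minimal sketch, assuming the tool is delivered as a single vendor-hosted JavaScript tag added once to a master page template so it is available on every page; the account ID and CDN URL below are hypothetical placeholders, not the actual Optimost snippet.

```typescript
// Hypothetical example: a single hosted script tag injected from the master
// page template makes the testing tool available site-wide, with no further
// IT involvement per test. The account ID and URL are illustrative only.
const ACCOUNT_ID = "EXAMPLE-ACCOUNT-ID"; // placeholder, not a real account

function injectTestingSnippet(): void {
  const script = document.createElement("script");
  // Placeholder CDN URL standing in for the vendor-hosted snippet.
  script.src = `https://cdn.example-testing-vendor.com/${ACCOUNT_ID}.js`;
  // Testing snippets are usually loaded as early as possible in <head>
  // so variations apply before the original content is visible.
  document.head.appendChild(script);
}

injectTestingSnippet();
```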
I think it can serve the whole spectrum of experience, starting with people who are just getting used to web experimentation. It's really easy to pick up and use. If you're more experienced it works well because it just gets out of the way and lets you really focus on the experimentation side of things. So yeah, strongly recommend. I think it is well suited to small businesses and large enterprises alike. It's got a really low barrier to entry: it's very easy to integrate on your website and get results quickly. Likewise, if you are a big business, it's incrementally adoptable, so you can start out with one component of Optimizely, build from there, and start to build in things like data and CMS capabilities to augment experimentation as well. So it's a really strong pathway to grow your MarTech platform, whether you're a small company or a big company.
Because it is a managed service, the need for intervention by our internal IT group was removed. This allowed us to control the pace of the testing programme without being influenced by IT resource allocation.
The client and technical account managers are very good at suggesting tests or potential improvements
HP regularly holds customer forums which are always informative and provide an opportunity to learn from and network with peers and industry leaders
The platform includes drag-and-drop editor options for creating variations, which ease the A/B testing process, as they do not require any coding or development resources.
Setting it up is so simple that even a non-technical person can do it perfectly.
It provides real-time results and analytics with robust dashboard access, through which you can quickly analyze how different variations perform. With this, your team can make data-driven decisions quickly.
The dashboard interface is difficult to navigate, but I understand that they are currently developing/testing a new, much more user-friendly interface.
The cost can be a barrier for some organisations, but for us it is worth it. Also, they are in the process of releasing a less expensive self-authoring testing tool.
We have not only renewed our subscription three years running, but we have added the self-authoring tool and are looking to expand the subscription so that we can take advantage of the managed services on a global level.
I rated this question this way because, at this stage, Optimizely does almost everything we need, so I don't foresee a need to migrate to a new tool. We have the infrastructure already in place, and it would be a sizeable lift to pivot to another tool with no guarantee that it would work as well as or even better than Optimizely.
Optimizely Web Experimentation's visual editor is handy for non-technical users or quick iterative testing. When it comes to content changes it's as easy as going into WordPress, clicking around, and then seeing your changes live: what you see is what you get. The preview and approval process is also handy for sharing built experiments across teams, for QA purposes or otherwise.
I would rate Optimizely Web Experimentation's availability as a 10 out of 10. The software is reliable and does not experience any application errors or unplanned outages. Additionally, the customer service and technical support teams are always available to help with any issues or questions.
I would rate Optimizely Web Experimentation's performance as a 9 out of 10. Pages load quickly, reports are complete in a reasonable time frame, and the software does not slow down any other software or systems that it integrates with. Additionally, the customer service and technical support teams are always available to help with any issues or questions.
They always are quick to respond, and are so friendly and helpful. They always answer the phone right away. And [they are] always willing to not only help you with your problem, but if you need ideas they have suggestions as well.
The tool itself is not very difficult to use, so training was not very useful in my opinion. The training also did not cover success events more complex than a click (which my company, being ecommerce, is looking to measure beyond a mere click).
In retrospect:
- I think I should have stressed more demos / workshopping with the Optimizely team at the start. I felt too confident during the demo stages, and when it came time to actually start, I was a bit lost. (The answer is likely that I should have had them on hand for our first install; they offered, but I thought I was OK.)
- Really getting an understanding / asking them prior to install how to make it work for checkout pages, or pages that use dynamic content or user interaction to determine what the UI does. We could have saved some time by addressing this at the beginning, as there were some things we needed to create on our site for Optimizely to "use" as a trigger for the variation test.
- Having a number of planned/hoped-for tests already in hand before working with the Optimizely team. Sharing those thoughts with them would likely have started conversations on additional things we needed to do to make them work (rather than figuring that out during the actual builds). Since I had development time available, I could have added more things to the baseline installation while my developers were already "looking under the hood" of the site.
We evaluated Optimost against Adobe's similar offering (Target). The big difference between the two, and the reason why BSI chose Autonomy, was the managed service aspect. The idea that once the code was deployed on the site IT no longer had to be involved gave my team full ownership of the testing programme. With the Adobe product, the involvement of the internal IT group would have been required to launch each test, and this would have decreased the number of tests we could run each month. Back in the day I also used Offermatica/Omniture, and this too required IT involvement.
The ability to do A/B testing in Optimizely, along with the associated statistical modelling and audience segmentation, means it is a much better solution than using something like Google Analytics, where a lot more effort is required to identify and isolate the specific data you need to confidently make changes.
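To give a sense of the statistical legwork such a platform automates, here is a minimal sketch of a classic two-proportion z-test on conversion rates; the function name and figures are illustrative assumptions, not how Optimizely actually computes its results.

```typescript
// Illustrative sketch: a two-proportion z-test comparing conversion rates
// between a control and a variation. Experimentation platforms run this kind
// of analysis (and more sophisticated versions) automatically.
function twoProportionZTest(
  convControl: number, visitorsControl: number,
  convVariation: number, visitorsVariation: number
): { lift: number; zScore: number } {
  const pControl = convControl / visitorsControl;       // control conversion rate
  const pVariation = convVariation / visitorsVariation; // variation conversion rate
  const pooled =
    (convControl + convVariation) / (visitorsControl + visitorsVariation);
  const standardError = Math.sqrt(
    pooled * (1 - pooled) * (1 / visitorsControl + 1 / visitorsVariation)
  );
  return {
    lift: (pVariation - pControl) / pControl,            // relative lift
    zScore: (pVariation - pControl) / standardError,     // |z| > 1.96 ~ 95% confidence (two-tailed)
  };
}

// Made-up example figures: 480/12,000 control conversions vs 540/12,000 variation conversions.
const result = twoProportionZTest(480, 12000, 540, 12000);
console.log(result.lift, result.zScore);
```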
We can use it flexibly across lines of business and have it in use across two departments. We have different use cases and slightly different outcomes, but can unify our results based on impact to the bottom line. Finally, we can generate value from anywhere in the org for any stakeholders as needed.
Using HP Optimost was the primary driver behind a 40% increase in UK classroom training courses booked online; read more details here: http://www.autonomy.com/work/news/details/hsx6767d
HP Optimost testing led to a 9% increase in sales by improving the BSI Shop's checkout funnel in 2012
HP Optimost is integral to the success of BSI's continuous improvement testing programme
We're able to share definitive annualized revenue projections with our team, showing what would happen if we put a test into Production
Showing the results of a test on a new page or feature prior to full implementation on a site saves developer time (if a test proves the new element doesn't deliver a significant improvement).
Making a change via the WYSIWYG interface allows us to see multiple changes without developer intervention.