Adobe acquired Omniture in 2009; the SiteCatalyst platform it gained was later re-branded as Adobe Analytics. It is now part of Adobe Marketing Cloud along with other products such as social marketing, testing and targeting, and tag management.
SiteCatalyst is one of the leading products in the web analytics category and is particularly strong in combining web analytics with other digital marketing capabilities like audience management and data management.
Adobe Analytics also includes predictive marketing capabilities that help…
Optimizely Web Experimentation
Score 8.7 out of 10
Whether launching a first test or scaling a sophisticated experimentation program, Optimizely Web Experimentation aims to deliver the insights needed to craft high-performing digital experiences that drive engagement, increase conversions, and accelerate growth.
Adobe Analytics is more expensive than Google Analytics, but Adobe Analytics is also more robust, customizable, and extensive. The level of detail and specificity available in Adobe Analytics is much greater.
Adobe Analytics (or any natural reporting tool) will always be my vote over any testing tool's reports (like Adobe Target & Optimizely), due to a much larger universe of reporting options available to the user. Amplitude has a lot of flashy bells and whistles, and is relatively …
It's a lot more, well, site stacked, it's way better than that. Adobe Target. I think the UI is easier to use on Optimizely. The one thing that I would say comparatively is our analytics talking to each other. Obviously Adobe, we use Adobe Analytics and Adobe Target, so they …
Optimizely Web Experimentation stacks up favorably against Adobe Target, Kameleoon, and Oracle CX Marketing in terms of ease of use, customer service, and features. It is easy to set up and execute experiments, the customer service team is always quick to respond to any …
I have used tools in various spaces that have all the flashy bells and whistles but lack some basic features - Optimizely isn't this. While other tools, such as Adobe Target, Evergage, Dynamic Yield, Google Optimize, or even Taplytics may make more sense for your …
Maxymiser - The statistical significance engine used by Optimizely helps to reduce the detection of false positives, which we noticed on many occasions within the Maxymiser tool.
Sr. Director, Traffic, Testing and Conversion Optimization
Chose Optimizely Web Experimentation
Prior to Optimizely, we used Test & Target (now Adobe, formerly Omniture and before that, Offermatica). Optimizely is better in every way. It's much easier to use (interface), and it's MUCH easier to have tests run since it requires just one code snippet (where T&T often …
Maybe for a small company with a few products, Adobe may be a bit too much of an implementation, but when it comes to companies like us (life sciences or large enterprises, or even small enterprises with more products and more analysis needed to make their marketing experience better), an Adobe product may be the best fit.
I think it can serve the whole spectrum of experiences, from people who are just getting used to web experimentation. It's really easy to pick up and use. If you're more experienced then it works well because it just gets out of the way and lets you really focus on the experimentation side of things. So yeah, strongly recommend. I think it is well suited both to small businesses and large enterprises. I think it's got a really low barrier to entry. It's very easy to integrate on your website and get results quickly. Likewise, if you are a big business, it's incrementally adoptable, so you can start out with one component of optimizing and you can build there and start to build in things like data CMS to augment experimentation as well. So it's got a really strong pathway to grow your MarTech platform whether you're a small company or a big company.
It summarizes large, complex data better than any other analytics solution I've dealt with, without the need for sampling, and gives the right level of detail, the right level of breakdowns, and aggregation. I consistently not only use Adobe Analytics, but I use other data sets and compare them against Adobe Analytics. And as I go into Adobe Analytics and compare, as long as I've done the query right in the other systems, they're very, very close. And if anything, with a lot of Adobe's newer products, they've gotten more accurate over time. So that's basically, you asked me what I liked about it. I like that it's accurate. I like that I don't have to do a lot of explaining. There's enough explaining in the world of web analytics without having to go back and explain why data's problematic. And so, like I said, provided that the implementation is correct, it's a very easy conversation. Even if people may not like the answer.
The platform contains drag-and-drop editor options for creating variations, which eases the A/B testing process, as it does not require any coding or development resources.
Setting it up is so simple that even a non-technical person can do it perfectly.
It provides real-time results and analytics with robust dashboard access, through which you can quickly analyze how different variations perform. With this, your team can easily make data-driven decisions quickly.
Support. I mentioned this earlier and we don't know what we don't know. Researching the massive amounts of documentation isn't realistic with bandwidth constraints, and our rep getting frustrated with us when we go through what we are seeing is disappointing.
Education. More please, and designed more towards the "business side". I get that with the many, many different implementations (every company is different!) it's tough, but even the basics of the basics would be nice for situations that everyone is looking at, like engagement with the merchandising on the home page (or any given page).
We've found multiple uses for Adobe Analytics in our organization. Each department analyzes the data it needs and creates action items based on that data. For E-Commerce, we're constantly using data to analyze user engagement and website performance and to evaluate ROI.
I rated this question this way because, at this stage, Optimizely does almost everything we need, so I don't foresee a need to migrate to a new tool. We have the infrastructure already in place, and it would be a sizeable lift to pivot to another tool with no guarantee that it would work as well as or better than Optimizely.
Sometimes the processing times are very long; I have had reports or dashboards time out multiple times during presentations. This could be improved. It is understandable, since the tool is processing a huge data set before showing anything; however, for a company that large, they should invest in optimizing processing times.
Optimizely Web Experimentation's visual editor is handy for non-technical or quick iterative testing. When it comes to content changes, it's as easy as going into WordPress, clicking around, and then seeing your changes live: what you see is what you get. The preview and approval process for built experiments is also handy for sharing experiments across teams, for QA purposes or otherwise.
I do not recall a single time when Adobe Analytics was unavailable to me in the 8 or so years I have been an end user of the product. My most-used day-to-day analytics tool, Parse.ly, however, generally has multiple hours of planned offline maintenance every two to four weeks, and sometimes has issues collecting realtime analytics that last anywhere from 15 minutes to an hour and happen anywhere from 1 to 5 times a month.
I would rate Optimizely Web Experimentation's availability as a 10 out of 10. The software is reliable and does not experience any application errors or unplanned outages. Additionally, the customer service and technical support teams are always available to help with any issues or questions.
Again, no issues here. Within-day performance updates hourly; other reports are updated overnight and available to access by the next morning. Pages load quickly, the site navigates easily, and the UX is quite straightforward to get command over. On this front, I give Adobe kudos for building a great experience to work within.
I would rate Optimizely Web Experimentation's performance as a 9 out of 10. Pages load quickly, reports are complete in a reasonable time frame, and the software does not slow down any other software or systems that it integrates with. Additionally, the customer service and technical support teams are always available to help with any issues or questions.
I barely see any communication from Adobe Analytics, and the content on the web is also not that great or easy to read. I would recommend better communication about the product, with information about new add-ons reaching users through better means.
They always are quick to respond, and are so friendly and helpful. They always answer the phone right away. And [they are] always willing to not only help you with your problem, but if you need ideas they have suggestions as well.
It was a one-day training several years ago that cost the organization several thousand dollars. There were only about 10 people in the training class. Adobe tried to cram so much information into that one-day class that none of our users felt they really learned anything helpful from the experience. Follow-up training is too expensive.
The online training for Adobe SiteCatalyst consists of short product videos. These are ok, but only go so far. For a while Adobe charged a fee for this, but recently made these available for free. There are many great blog posts that help users learn how to apply the product as well.
The tool itself is not very difficult to use, so training was not very useful in my opinion. It also did not account for success events more complex than a click (which my company, being in e-commerce, is looking to examine beyond mere clicks).
One of the keys (and obstacles) to successfully using Adobe Analytics is a great, accurate implementation: make sure your analytics group is intimate with the details of the implementation and that the requirements are driven by the business.
In retrospect:
- I think I should have stressed more demos/workshopping with the Optimizely team at the start. I felt too confident during the demo stages, and when it came time to actually start, I was a bit lost. (The answer is likely that I should have had them on hand for our first install; they offered, but I thought I was OK.)
- Really getting an understanding, by asking them prior to install, of how to make it work for checkout pages, or any page that uses dynamic content or user interaction to determine what the UI does. We could have saved some time by addressing this at the beginning, as there were some things we needed to create on our site for Optimizely to "use" as a trigger for the variation test.
- Having a number of planned/hoped-for tests already in hand before working with the Optimizely team. Sharing those thoughts with them would likely have started conversations on additional things we needed to do to make them work (rather than figuring that out during the actual builds). Since I had development time available, I could have added more things to the baseline installation while my developers were already "looking under the hood" of the site.
Google Analytics comes across as more of a reporting tool, whereas Adobe Analytics is more of an enterprise-level analytics tool. Contentsquare provides some traffic and flow capabilities, but not to the same level as Adobe Analytics. However, Contentsquare's major advantage is its Zoning (heatmapping), Impact Quantification, and Find 'n' Fix modules, none of which, to my knowledge, are available in Adobe Analytics.
The ability to do A/B testing in Optimizely, along with the associated statistical modelling and audience segmentation, makes it a much better solution than using something like Google Analytics, where a lot more effort is required to identify and isolate the specific data you need to confidently make changes.
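The statistical modelling the reviewer refers to can be illustrated with the classic fixed-horizon two-proportion z-test that A/B testing tools build on. This is a minimal sketch with invented numbers, not Optimizely's actual Stats Engine (which uses a sequential approach); the function name is hypothetical.

```python
# Illustrative sketch of a two-sided, two-proportion z-test for
# comparing conversion rates between a control (A) and a variation (B).
from math import sqrt, erf

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF expressed via erf; doubled for a two-sided test.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Variation B converts 120/1000 visitors vs. control A at 100/1000.
p = ab_test_p_value(100, 1000, 120, 1000)
print(p < 0.05)  # with these samples p is around 0.15: not yet significant
```

The point of a principled engine is exactly what the Maxymiser comparison above hints at: without controlling the false-positive rate, an apparent lift like B's 12% vs. 10% can easily be noise.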
Adobe Analytics is relatively affordable compared to other tools, given it provides a range of flexible variables to use that I have not found in any other tools so far. It is worth investing in if your company is medium or large-sized and brings a steady flow of revenue. For small companies, it can be overpriced.
My organization uses Adobe Analytics across a multitude of brand portfolios. Each brand has multiple websites, mobile apps and some even have connected TV apps/channels on Roku and similar devices. Adobe can handle the multitude of properties that have simple, small(ish) websites and the larger brand properties that include web, mobile and connected TVs/OTT devices.
Each of those larger brands has multiple categories and channels to keep track of. We can see the data by channel/device or aggregate all the data together. This gives our executive teams the full picture and the departmental teams the view they need to see their own performance.
We can use it flexibly across lines of business and have it in use across two departments. We have different use cases and slightly different outcomes, but can unify our results based on impact to the bottom line. Finally, we can generate value from anywhere in the org for any stakeholders as needed.
The professional services team is one of the best for complex Adobe Analytics implementations, especially for clients with multiple websites and mobile applications. However, the cost of professional services is a bit high, which makes a few clients opt out of it; for large-scale implementations, though, they are very helpful.
Adobe Analytics impacts nearly every aspect of a billion-plus-dollar-revenue eCommerce business, from measuring the impact of newly built features to marketing campaigns.
We are saving substantial money and resource effort by consolidating all of our properties onto Adobe Analytics from alternative solutions; once that is complete, we will finally be able to report on Total Digital rather than on disparate reports.
We support experimentation on every platform and the performance is only known through Adobe Analytics tagging.
We're able to share definitive annualized revenue projections with our team, showing what would happen if we put a test into Production.
Showing the results of a test on a new page or feature prior to full implementation on a site saves developer time (if a test proves the new element doesn't deliver a significant improvement).
Making a change via the WYSIWYG interface allows us to see multiple changes without developer intervention.
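The annualized revenue projection mentioned in the list above is, at its simplest, observed lift times baseline revenue scaled to a year. A hedged sketch with entirely made-up numbers (the actual projection methodology these teams use is not described in the reviews):

```python
# Hypothetical back-of-envelope projection: scale a test's observed
# revenue lift against baseline revenue over a full year.
def annualized_revenue_lift(baseline_weekly_revenue, observed_lift,
                            weeks_per_year=52):
    """Project annual incremental revenue from a test's observed lift."""
    return baseline_weekly_revenue * observed_lift * weeks_per_year

# A 2% revenue lift on $50,000/week of baseline revenue:
projection = annualized_revenue_lift(50_000, 0.02)
print(projection)  # 52000.0, i.e. ~$52k/year if the lift holds
```

In practice such a projection assumes the test's lift persists year-round and that traffic is stationary, which is why it is a planning figure rather than a guarantee.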