erwin Data Modeler by Quest is a data modeling tool used to find, visualize, design, deploy, and standardize high-quality enterprise data assets. It can discover and document any data from anywhere for consistency, clarity, and artifact reuse across large-scale data integration, master data management, metadata management, Big Data, business intelligence, and analytics initiatives, all while supporting data governance and intelligence efforts.
Tableau Desktop
Score 8.3 out of 10
Tableau Desktop is a data visualization product from Tableau. It connects to a variety of data sources, allowing disparate sources to be combined without coding, and provides tools for discovering patterns and insights, performing data calculations, forecasts, and statistical summaries, and telling stories visually.
I have had a chance to use a few other data modeling tools from Quest and Oracle, but I am most comfortable using erwin Data Modeler. They understand your data modeling needs and have designed the software to give you a feeling of completeness when you are designing a data model.
Tableau Desktop is one of the finest tools available in the market, with a wide range of capabilities that make it easy to generate insights. Further, if optimally designed, its reports are fairly simple to understand, yet flexible enough to allow changes at the required levels. One can create a variety of visualizations as required by the business or the clients. The data pipelines in the backend are very robust. Tableau Desktop also provides options to develop reports in developer mode, one of its finest features, which lets you embed and execute even the most complex logic. It's easy to operate, simple to navigate, and straightforward for users to understand.
Reverse Engineering: I love the way we can import a SQL file containing schema metadata and generate an ER diagram from it. This is especially useful if you are implementing erwin Data Modeler for an existing database.
Forward Engineering: We use this feature very frequently: we make database changes in our physical and logical data models and then generate deployment scripts for those changes (a rough code analogy follows this list).
Physical vs. Logical Models: I like to have my database model split into physical and logical models that are still linked to each other. Any change you make to the logical model or the physical model shows up in the other.
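erwin drives forward engineering from its GUI, so there is no erwin code to show, but the underlying idea is easy to sketch: compile a model definition into DDL for a target database. Below is a minimal, illustrative Python sketch using SQLAlchemy as a stand-in for the modeling layer; the tables, columns, and target dialect are all hypothetical and not from erwin itself.

    # Illustrative only: SQLAlchemy stands in for the modeling layer;
    # erwin generates its DDL from the GUI, not through this API.
    from sqlalchemy import Column, ForeignKey, Integer, MetaData, String, Table
    from sqlalchemy.dialects import oracle
    from sqlalchemy.schema import CreateTable

    metadata = MetaData()

    # Hypothetical "logical model": two entities and one relationship.
    customer = Table(
        "customer", metadata,
        Column("customer_id", Integer, primary_key=True),
        Column("name", String(100), nullable=False),
    )
    customer_order = Table(
        "customer_order", metadata,
        Column("order_id", Integer, primary_key=True),
        Column("customer_id", Integer, ForeignKey("customer.customer_id")),
    )

    # "Forward engineer" the model into deployment DDL for a target dialect.
    for table in metadata.sorted_tables:
        print(CreateTable(table).compile(dialect=oracle.dialect()))

Running this prints CREATE TABLE statements in dependency order, which is essentially what a forward-engineered deployment script contains.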
An excellent tool for data visualization, it presents information in an appealing visual format—an exceptional platform for storing and analyzing data in any size organization.
Through interactive parameters, it enables real-time interaction with the user; it is also easy to learn, and support from the community is easy to find.
Our use of Tableau Desktop is still fairly low and will continue to grow over time. The only real concern is the cost of the licenses, and I have mentioned this to Tableau and fully expect the development of more sensible licensing models for our industry. This will remove any impediment to the expansion of our use.
I have had a lot of experience using erwin Data Modeler for designing data models. I think it's pretty intuitive and easy to use, and it has enough features to represent your database requirements in the form of a model.
Tableau Desktop has proven to be a lifesaver in many situations. Once we've completed the initial setup, it's simple to use. It has all of the features we need to quickly and efficiently synthesize our data. Tableau Desktop has advanced capabilities to improve our company's data structure and enable self-service for our employees.
When used as a stand-alone tool, Tableau Desktop has unlimited uptime, which is always nice. When used in conjunction with Tableau Server, this tool has as much uptime as your server admins are willing to give it. All in all, I've never had an issue with Tableau's availability.
Tableau Desktop's performance is solid. You can really dig into a large dataset in the form of a spreadsheet, and it exhibits similarly good performance when accessing a moderately sized Oracle database. I noticed that with Tableau Desktop 9.3, performance against a spreadsheet started to slow around 75K rows by about 60 columns. This was easily remedied by creating an extract and pushing it to Tableau Server, where performance became lightning fast.
CA customer support and our account manager have been able to support us with any issues we have had, from managing our serial keys to resolving problems we logged tickets for. There are aspects of key management that have made things difficult over the years, but support has usually worked with us.
I have never really used support much, to be honest; I don't find it especially user-friendly to search and use. I did have an encounter with them once, and it required a bit of back and forth over licensing before reaching a resolution. They did solve my issue, though.
It is admittedly hard to train a group of people with disparate levels of ability coming in, but the software is so easy to use that this is not a huge problem; anyone who can follow simple instructions can catch up pretty quickly.
The training for new users is quite good because it is organized topic by topic, and the best part is that it also includes video tutorials, which are very helpful.
Again, training is the key, and the company provides a lot of example videos that help users discover use cases and create original visualizations. As with any new software tool, productivity will decline for a period. In the case of Tableau, the decline period is short and the later gains are well worth it.
Not listed, but I've only used alternatives built into something like the Squirrel SQL editor. That one is semi-functional but lacking many features and, in some instances, just plain wrong. The only pro there is that it's freely available and works over ODBC. I've tried some of the other free ones like Creately but didn't have much success.
If we did not have legacy tools already set up, I would switch the visualization work to open-source software via PyCharm, Atom, or Visual Studio IDE. These IDEs cannot directly visualize the data, but you can use many Python packages to do so through them (a minimal sketch follows).
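As a rough illustration of that route, here is a minimal sketch using pandas and matplotlib; the file name and column names are hypothetical placeholders, not anything from an actual setup.

    # Minimal sketch of code-driven visualization in place of a BI tool.
    # "sales.csv" and its columns are hypothetical placeholders.
    import matplotlib.pyplot as plt
    import pandas as pd

    df = pd.read_csv("sales.csv", parse_dates=["order_date"])

    # Aggregate monthly revenue, the kind of rollup a Tableau sheet would do.
    monthly = df.resample("M", on="order_date")["revenue"].sum()

    fig, ax = plt.subplots(figsize=(8, 4))
    monthly.plot(ax=ax, marker="o")
    ax.set_title("Monthly revenue")
    ax.set_xlabel("Month")
    ax.set_ylabel("Revenue")
    fig.tight_layout()
    plt.show()

The trade-off is the one noted above: you get the flexibility and zero license cost of code, but you give up drag-and-drop interactivity.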
Tableau Desktop's scalability is really limited only by the scale of your back-end data systems. If you want to pull down an extract and work quickly in-memory, in my application it scaled to a few tens of millions of rows using the in-memory engine. Beyond that, it's limited only by your back-end data store, if you have or are willing to invest in an optimized SQL store or a purpose-built query engine like Vertica or Netezza.
Tableau was acquired years ago and has provided good value with the content created.
Ongoing maintenance costs for the platform, covering both desktop and server licensing, have made its continuing value questionable when compared to other offerings in the marketplace.
Users have largely been satisfied with the content, but not with the overall performance. This is due to a combination of factors, including the performance of the Tableau engines as well as development deficiencies.