Azure Synapse Analytics
Formerly Azure SQL Data Warehouse


What is Azure Synapse Analytics?

Azure Synapse Analytics is described as the former Azure SQL Data Warehouse, evolved: a limitless analytics service that brings together enterprise data warehousing and Big Data analytics. It gives users the freedom to query data using either serverless or provisioned resources, at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate BI and machine learning needs.





Pricing

  • Tier 1: $4,700 per month for 5,000 Synapse Commit Units (SCUs)
  • Tier 2: 10,000 Synapse Commit Units (SCUs) per month
  • Tier 3: 24,000 Synapse Commit Units (SCUs) per month

Entry-level setup fee?

  • No setup fee
For the latest information on pricing, visit…


  • Free Trial
  • Free/Freemium Version
  • Premium Consulting / Integration Services

Starting price (does not include setup fee)

  • $4,700 per month for 5,000 Synapse Commit Units (SCUs)

Product Details


Azure Synapse Analytics Technical Details

Deployment Types: Software as a Service (SaaS), Cloud, or Web-Based
Operating Systems: Unspecified
Mobile Application: No

Frequently Asked Questions

Azure Synapse Analytics is described as the former Azure SQL Data Warehouse, evolved, and as a limitless analytics service that brings together enterprise data warehousing and Big Data analytics. It gives users the freedom to query data using either serverless or provisioned resources, at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate BI and machine learning needs.

Azure Synapse Analytics starts at $4,700 per month for 5,000 Synapse Commit Units (SCUs).

The most common users of Azure Synapse Analytics are from Enterprises (1,001+ employees).



Reviews and Ratings


Score 8 out of 10
Vetted Review
Verified User
Our data warehouse was growing at a 1TB/year rate, and we needed a solution that would be both cheap and effective.
Previously we were using Azure SQL Database with its JSON capabilities and various Azure serverless services to manage our data, but at that growth rate, time and cost were becoming limiting factors.
  • Build, schedule, and monitor complex data pipelines (Azure Data Factory component)
  • Access your data lake using the familiar T-SQL syntax and TDS-enabled tools (SSMS, ADS, ...). This is especially useful for business people who are used to a specific workflow.
  • Support a wide range of data transformation tools, from low-code (Data Flows) to full-code (Spark), all integrated in a single central orchestrator (Azure Data Factory-like)
  • Provide all these services as a single, very convenient package, without needing to know all the underlying configuration beforehand
  • There's no support for Synapse Serverless objects (e.g., views) in SSDT - the VCS-friendly approach to schema deployments from Microsoft. SSDT is available for almost all other SQL Server and Azure SQL products, including Synapse Dedicated SQL Pools.
  • There are lots of ways to accomplish the same task, and it's not very clear which one is best suited for a given scenario other than trial and error. Also, some scenarios (e.g., efficient management of late arrivals) don't have a clear solution path.
  • I think it would be cool to have a tighter integration of the product with the Azure Data Studio client, not only for connecting to SQL Serverless or Dedicated Pools. For example, PySpark development and debugging would be much easier if done from ADS.
It's well suited for large, fast-growing, and frequently changing data warehouses (e.g., in startups). It's also suited for companies that want a single, relatively easy-to-use, centralized cloud service for all their data needs. Larger, more structured organizations could still benefit from this service by using Synapse Dedicated SQL Pools, knowing that costs will be much higher than with other solutions.
I think this product is not suited for smaller, simpler workloads (where an Azure SQL Database and a Data Factory could be enough) or for very large scenarios, where it may be better to build custom infrastructure.
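The T-SQL-over-data-lake workflow this reviewer praises can be sketched with a serverless SQL pool query over Parquet files. This is a minimal illustration only; the storage URL, alias, and column names below are hypothetical, not taken from the review:

```sql
-- Hypothetical sketch: query Parquet files in the data lake directly from
-- a Synapse serverless SQL pool, using plain T-SQL.
-- The storage path and column names are illustrative.
SELECT TOP 100
    s.Region,
    SUM(s.Amount) AS TotalAmount
FROM OPENROWSET(
        BULK 'https://mydatalake.dfs.core.windows.net/sales/2021/*.parquet',
        FORMAT = 'PARQUET'
     ) AS s
GROUP BY s.Region
ORDER BY TotalAmount DESC;
```

Because the serverless endpoint speaks TDS, a query like this runs unchanged from SSMS, Azure Data Studio, or any other SQL client, which is the workflow continuity the reviewer highlights.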
Scott Kennedy | TrustRadius Reviewer
Score 8 out of 10
Vetted Review
Verified User
Microsoft Azure Synapse Analytics (formerly Azure SQL Data Warehouse) is being used as our marketing data warehouse. We pull data down from a number of different APIs, such as Facebook Ads, Google Ads, and Google Analytics, and then pump that information back into the Azure Synapse Analytics warehouse on a daily basis.
  • They unify many data sources easily
  • There is some "code free" ETL work it enables
  • There is some AI integration that works nicely
  • The cost structure is difficult to understand
  • The job scheduling capabilities aren't easy to use
  • The logging metrics aren't easy to see
Azure Synapse Analytics is very well suited for companies that are using the Microsoft Power BI analytics (business intelligence) tool. The reason is that you don't need to provide a data gateway to move data from your database to the reporting service online if you are using this type of database. This is a huge win for processing data.
Score 8 out of 10
Vetted Review
Verified User
As a consulting company, we implement data warehouse solutions for our clients. We use Azure Synapse for enterprise data warehouse implementations. Data from various internal sources like sales, finance, and operations is integrated into Synapse via Azure Data Factory and Data Lake. It's also used as a reporting data source for Microsoft Power BI.
  • Data integration via PolyBase
  • Data distribution
  • CREATE TABLE AS SELECT (CTAS)
  • Resource allocation via user groups (for production ETL and report users)
  • Integrating external third-party data sources is very easy in Snowflake, and it's missing in Azure Synapse
  • Master Data Services and Data Quality Services are missing in Azure Synapse. They are useful features present in on-prem SQL Server
  • Resource usage reports (top 10 expensive queries, most frequently run queries, etc.) are a feature that could be added to Azure Synapse. It's present in an on-prem SQL Server. The DMVs are there, but viewing them visually as a report is more helpful.
Big data loads are made simple using the PolyBase feature. You just have to create external tables to connect to data source files (of any format) in Azure Data Lake. There is no need for the map-reduce work done on Hadoop clusters. You just need to know SQL to do data integration.
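The pattern this reviewer describes, external tables over lake files followed by a relational load, might look like the following in a dedicated SQL pool. This is a hedged sketch: the data source, file format, schema, and table names are all illustrative, and credential/authentication setup is omitted:

```sql
-- Illustrative sketch of the PolyBase-style load path (names hypothetical;
-- credential objects omitted for brevity).
CREATE EXTERNAL DATA SOURCE LakeSource
WITH (LOCATION = 'abfss://data@mydatalake.dfs.core.windows.net');

CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);

-- The external table is just metadata over the files sitting in the lake.
CREATE EXTERNAL TABLE ext.SalesRaw (
    SaleId   BIGINT,
    SaleDate DATE,
    Amount   DECIMAL(18, 2)
)
WITH (
    LOCATION    = '/sales/',
    DATA_SOURCE = LakeSource,
    FILE_FORMAT = ParquetFormat
);

-- CREATE TABLE AS SELECT (CTAS) then materializes it as a distributed
-- internal table, with no map-reduce code involved.
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = HASH(SaleId))
AS SELECT * FROM ext.SalesRaw;
```

Everything here is expressed in SQL, which is the reviewer's point: data integration without writing Hadoop-style jobs.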
Score 10 out of 10
Vetted Review
Verified User
We've been using Azure Synapse Analytics to create data pipelines for on-prem/cloud ETL processing, where the transformed data is stored in Azure Data Lake for further processing using Power BI.
  • Create data pipelines to connect with multiple data workspace(s) and external data
  • Ability to connect with Azure Data Lake (sequentially) for data warehousing
  • Being able to manage connections and create integration runtimes (for onPrem data capture)
  • Thus far haven't seen any complications and/or any missing features
In terms of a well-suited scenario, Azure Synapse can be used to capture data from multiple sources (especially on-prem sources, apart from Dataverse) and update the transformed data based on given conditions (e.g., refresh data based on specified date/time ranges). Also, the transformed data can simply be transferred to Azure Data Lake for further processing with other analytics tools such as Power BI.
Vladimír Mužný | TrustRadius Reviewer
Score 10 out of 10
Vetted Review
Verified User
I am an independent developer using Azure Synapse for other companies. My company is machinery-production oriented (i.e., automotive companies), so I'm used to utilizing Synapse for statistics-driven quality control, some logistics work, etc.
  • The combination of SQL and unstructured data
  • Keeping things "complicated, but simple": heterogeneous data formats are seen as just SQL tables by business experts used to Power BI, Excel, and other traditional SQL-oriented BI tools
  • Integration options using "Synapse pipelines", the application of ADF
  • The greatly integrated solution of independent pieces (Spark MPP cluster, MPP SQL servers, ADF), all sitting under one roof. Great job!
  • Integration with super-fast, globally replicated data. I really appreciate the integration of NoSQL databases (namely the Core API and Mongo API under Cosmos DB) with purely batch-processed BI data
  • I have no idea right now. But... Synapse Analytics is typically seen as batch processing of source data. What about tighter cooperation with streaming features like Event Hubs?
The most frequent answer to questions like this should be: IT DEPENDS. Synapse Analytics has a certain role in its DNA; it's not dedicated to tasks like OLTP with many reads. When we are talking about Azure Synapse, we are talking about modernized BI with great capabilities to involve big data processing to reach deeper insights into our data.
Score 8 out of 10
Vetted Review
Verified User
Azure Synapse Analytics is being used for data warehousing: Azure Data Factory pulls the initial data from the source into the Data Lake, then a Spark notebook processes it from raw (bronze) to staging (silver) in Synapse dedicated pools, then stored procedures in the Synapse dedicated pool process it from staging to reporting (gold).
  • Fast query results
  • Integrated systems
  • One application/area for all processes
  • Delta Lake doesn't have full capabilities yet
  • Spark doesn't yet have Delta Live Tables
  • Coding differences from Databricks' Spark aren't well documented
Azure Synapse Analytics is well suited to Data Warehouse scenarios with large data tables because of its distributed computing. If most tables have fewer than 1 million rows, then the cost of Synapse is not worthwhile - regular Azure SQL or Azure Analysis Services could suffice. If most tables have more than 1 million rows, then it's worthwhile to get the additional speed for querying large data sets.
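The distributed-computing trade-off this reviewer describes shows up directly in table DDL: a dedicated SQL pool spreads each table across 60 distributions, so large fact tables are typically hash-distributed while small lookup tables are replicated. A sketch with hypothetical table and column names:

```sql
-- Illustrative only: a large fact table is hash-distributed so scans and
-- joins run in parallel across distributions; a small dimension table is
-- replicated to every compute node to avoid data movement.
CREATE TABLE dbo.FactSales (
    SaleId     BIGINT NOT NULL,
    CustomerId INT,
    Amount     DECIMAL(18, 2)
)
WITH (
    DISTRIBUTION = HASH(SaleId),
    CLUSTERED COLUMNSTORE INDEX
);

CREATE TABLE dbo.DimCustomer (
    CustomerId INT NOT NULL,
    Name       NVARCHAR(200)
)
WITH (DISTRIBUTION = REPLICATE);
```

Below roughly a million rows per table, the overhead of distributing the data outweighs the parallelism, which matches the reviewer's cost guidance.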
August 25, 2021

Modern Database

Score 8 out of 10
Vetted Review
Verified User
We use Azure Synapse Analytics (Azure SQL Data Warehouse) to hold all our daily sales data to serve reports. Without any storage constraint, we save large datasets and process them quickly, thanks to Azure Data Lake Storage support and massively parallel processing capabilities. It supports major file formats like Avro, Parquet, and many more.
  • Easy to manage data
  • Blazing-fast query processing
  • Supports modern file formats
  • Documentation and use cases
  • Pricing
  • Admin capabilities
It suits enterprises that need to manage huge datasets and want big data capabilities in a cost-efficient way, as well as enterprises that process real-time data for their analysis, like streaming and IoT data. Combining Azure Synapse Analytics and Data Lake Storage provides better performance and a cost-effective way to manage a huge dataset.
Samir Patel, PMP | TrustRadius Reviewer
Score 9 out of 10
Vetted Review
Verified User
SQL Data Warehouse is being used to hold all of our summary-level reporting. The data is loaded using SSIS and transformed into a star schema. SQL does a great job mapping all of the OLAP values and providing efficient structures to house all of the reporting data. We then use a reporting tool to build cubes and publish the data.
  • It is very cost-effective
  • Development time needed was much less in comparison to other systems
  • Played very nicely with our ETL and OLAP reporting tools
  • More features would be a plus
  • I would like to see Microsoft offer some diagramming tools for data warehouses
  • I believe processing time and speed could always be improved

SQL Data Warehouse is always well suited in a Microsoft SQL environment. When you are using tools like SSIS, SSAS and SSRS, SQL Data Warehouse fits in nicely as the OLAP backend.

Some challenges faced with this product are in very large, expansive environments where the transactional databases might be coming from different sources like Oracle or Sybase.

Score 9 out of 10
Vetted Review
Verified User
We use it to store large amounts of SQL data for our predictive analytics and big data modeling. We use it across several teams, but I cannot say it is used for the entire organization, as my department operates relatively independently of the rest of the organization. We have extremely large data sets and need to store them in a way that makes them accessible and fast.
  • Quick to return data. Queries in a SQL data warehouse architecture tend to return data much more quickly than an OLTP setup, especially with columnar indexes.
  • Ability to manage extremely large SQL tables. Our databases contain billions of records. This would be unwieldy without a proper SQL data warehouse.
  • Backup and replication. Because we're already using SQL, moving the data to a data warehouse makes it easier to manage, as our users are already familiar with SQL.
  • It takes some time to set up a proper SQL data warehouse architecture. Without proper SSIS/automation scripts, this can be a very daunting task.
  • It takes a lot of foresight when designing a data warehouse. If not properly designed, it can be very troublesome to use and/or modify later on.
  • It takes a lot of effort to maintain. Businesses are continually changing. With that, one or more full-time staff members will be required to maintain the SQL data warehouse.
It is very well suited for big data analytics. Predictive modeling, optimization, and other large-scale analyses benefit from using a properly designed SQL data warehouse. It is also suited to simple business intelligence, such as building historical and active dashboards.
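The "columnar indexes" credited above for fast analytical reads correspond to clustered columnstore indexes in the SQL Server family. A hedged sketch of why they help, with hypothetical table and column names:

```sql
-- Illustrative: store an analytics table as a clustered columnstore index
-- so aggregate queries read only the columns they touch. Names are
-- hypothetical, not from the review.
CREATE TABLE dbo.Events (
    EventId   BIGINT,
    EventTime DATETIME2,
    Payload   NVARCHAR(400)
)
WITH (CLUSTERED COLUMNSTORE INDEX);

-- This aggregate scans only the EventTime column, not the wide Payload
-- column, which is where the OLTP-versus-warehouse speed gap comes from:
SELECT CAST(EventTime AS DATE) AS EventDay, COUNT(*) AS Events
FROM dbo.Events
GROUP BY CAST(EventTime AS DATE);
```

Columnstore segments are also compressed per column, which further reduces I/O on billion-row tables like the ones this reviewer describes.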