Azure Data Lake simplifies large-scale data analysis. It works with Hadoop, HDInsight, and Data Lakes, and even complex queries run smoothly and quickly. We write queries to transform data and extract insights instead of configuring hardware. It can handle any size job by adjusting the …
Azure Data Lake Analytics is beneficial when working with a lot of data. It can process enormous amounts of data extremely quickly. The service is secure and easy to set up, build on, scale, and run on Azure. For big data analytics and reporting, parallel processing has a significant impact. It consolidated our analytics from multiple systems and increased our analysis productivity. The tool has excellent support for reporting tools like Power BI and is very quick when performing analytics.
Event-based data can be captured seamlessly from our data layers (and exported to Google BigQuery). When events like page views, clicks, and add-to-cart are tracked, Google BigQuery helps us efficiently run queries to observe patterns in user behaviour. That intermediate step of trying to "untangle" event data is resolved by Google BigQuery. A scenario where it could be less appropriate is analysing very "granular" details (like small changes to a database happening very frequently).
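As a rough sketch of the kind of pattern query this enables, using the BigQuery Python client (the table `my-project.analytics.events` and its columns are made-up placeholders, not from our actual setup):

```python
# Sketch: counting tracked events to observe user-behaviour patterns.
# The table and column names here are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT
        event_name,                        -- e.g. page_view, click, add_to_cart
        COUNT(*) AS event_count,
        COUNT(DISTINCT user_id) AS unique_users
    FROM `my-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_name
    ORDER BY event_count DESC
"""

for row in client.query(query).result():
    print(f"{row.event_name}: {row.event_count} events, {row.unique_users} users")
```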
GSheet data can be linked to a BigQuery table, and the data in that sheet is ingested in real time into BigQuery. It's a live 'sync', which means it supports insertions, deletions, and alterations. The only limitation here is the schema; it remains static once the table is created.
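For reference, a minimal sketch of wiring a sheet up as a live external table with the Python client might look like the following (the project, dataset, table, sheet URL, and schema fields are hypothetical placeholders; the credentials also need Google Drive scope for Sheets-backed tables):

```python
# Sketch: defining a BigQuery external table backed by a Google Sheet.
# All IDs, the sheet URL, and the schema below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# External config pointing at the sheet; queries read the live sheet,
# so inserts, deletions, and edits show up in BigQuery immediately.
external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = [
    "https://docs.google.com/spreadsheets/d/EXAMPLE_SHEET_ID"
]
external_config.options.skip_leading_rows = 1  # skip the header row

# The schema is fixed at creation time -- the static-schema limitation
# mentioned above.
table = bigquery.Table(
    "my-project.my_dataset.sheet_table",
    schema=[
        bigquery.SchemaField("name", "STRING"),
        bigquery.SchemaField("score", "INTEGER"),
    ],
)
table.external_data_configuration = external_config
client.create_table(table)
```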
Seamless integration with other GCP products.
A simple pipeline might look like this:
GForms -> GSheets -> BigQuery -> Looker
It all links up easily and works really well.
One instance holds many projects.
Separating data into data marts or data meshes is really easy in BigQuery, since one BigQuery instance can hold multiple projects, which are isolated collections of datasets.
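As a small illustration of that isolation, a single query can still read across projects just by fully qualifying the table names (all project, dataset, and table names below are made up):

```python
# Sketch: one query joining tables that live in two isolated projects.
# Project/dataset/table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="analytics-hub")  # project billed for the query

query = """
    SELECT s.order_id, c.segment
    FROM `sales-mart.orders.daily_orders` AS s   -- one data mart project
    JOIN `crm-mart.customers.segments` AS c      -- another, isolated project
      ON s.customer_id = c.customer_id
"""

for row in client.query(query).result():
    print(row.order_id, row.segment)
```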
There's a bit of bias towards cloud with ADL Analytics. Depending on a company's infrastructure strategy and investment plans, there are some challenges with migration and integration.
It's not worth the time, effort, or money if the organization doesn't have a large volume of data. It is cost-effective only when daily loads exceed around 1 million.
While training materials are available online, the adoption rate is yet to pick up.
Please expand the availability of documentation, tutorials, and community forums to provide developers with comprehensive support and guidance on using Google BigQuery effectively for their projects.
If possible, simplify the pricing model and provide clearer cost breakdowns to help users understand and plan for expenses when using Google BigQuery. Also, some cost reduction would be welcome.
The process of importing data into Google BigQuery still has gaps. Improving compatibility with different data formats and sources, and reducing the complexity of data ingestion workflows, would probably address this.
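For context, the basic ingestion path with the Python client is a load job; a minimal sketch for a CSV sitting in Cloud Storage might look like this (the bucket, file, and table names are placeholders):

```python
# Sketch: loading a CSV file from Cloud Storage into a BigQuery table.
# The bucket, file, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # header row
    autodetect=True,       # let BigQuery infer the schema
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events.csv",
    "my-project.my_dataset.events",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-project.my_dataset.events")
print(f"Loaded {table.num_rows} rows.")
```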
We have to use this product because a third-party supplier chose it for their data backend, so it is unlikely we will move away from it unless that supplier decides to change data vendors.
I think overall it is easy to use. I haven't done anything from the development side but am more of an end user of reporting tables built in Google BigQuery. I connect data visualization tools like Tableau or Power BI to the BigQuery reporting tables to analyze trends and create complex dashboards.
I have never had any significant issues with Google BigQuery. It always seems to be up and running properly when I need it. I cannot recall any times when I received any kind of application errors or unplanned outages. If there were any, they were resolved quickly by my IT team, so I didn't notice them.
I think Google BigQuery's performance is in the acceptable range. Larger datasets are sometimes somewhat sluggish to load, but for most of our applications it performs at a reasonable speed. We do have some reports that include a lot of complex calculations, and others that run on granular store-level data, that sometimes take a bit longer to load, which can be frustrating.
BigQuery can be difficult to support precisely because it is so solid as a product. Many of the issues you will see are related to your own data sets; however, you may see issues importing data and managing jobs. If this occurs, it can be a challenge to get to speak to the correct person who can help you.
We did some research on Alibaba Cloud Data Lake Analytics, and even though it was cheaper than Azure Data Lake Analytics, we decided to go for the latter once we noticed it has more features and better documentation. Another thing we considered during this process was the fact that more of our people already have Azure Cloud knowledge.
Power BI can connect to GA4, for example, but the data processing is more complicated and it takes longer to create dashboards. Azure is great once the data import has been configured, but for small businesses that setup is not as easy as it is with BigQuery.
We have continued to expand our use of Google BigQuery over the years. I'd say its flexibility and scalability are actually quite good. It also integrates well with other tools like Tableau and Power BI. It has served the needs of multiple data sources across multiple departments within my company.
Google Support has kindly provided individual support and consultants to assist with the integration work. When the consultants are not present to support the work, the Google Support helpline is always available to answer queries, without us having to wait more than 3 days.
Previously, running complex queries on our on-premises data warehouse could take hours. Google BigQuery processes the same queries in minutes. We estimate it saves our team at least 25% of their time.
We can target our marketing campaigns very easily and understand our customer behaviour. It lets us personalize campaigns and product recommendations, and we have seen at least a 20% improvement in overall campaign performance.
Now we only pay for the resources we use. We saved $1 million annually on data infrastructure and storage costs compared to our previous solution.