Event-based data can be captured seamlessly from our data layers and exported to Google BigQuery. When events like page views, clicks, and add-to-cart actions are tracked, Google BigQuery makes it efficient to run queries that surface patterns in user behaviour. It resolves the intermediate step of trying to "untangle" event data. A scenario where it could be less appropriate is analysing very "granular" details (like small changes to a database happening very frequently).
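As an illustration, here is a minimal sketch of the kind of behavioural query this enables, using the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical:

```python
# Minimal sketch: counting tracked events by type in BigQuery.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT event_name, COUNT(*) AS event_count
    FROM `my-analytics-project.analytics.events`
    WHERE event_name IN ('page_view', 'click', 'add_to_cart')
      AND event_date >= '2024-01-01'
    GROUP BY event_name
    ORDER BY event_count DESC
"""

for row in client.query(query).result():
    print(f"{row.event_name}: {row.event_count}")
```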
Ideal if the company is already a Microsoft shop, since chances are it is included free with SQL Server. Also good for moving data between on-premises systems. Not ideal for moving data to the cloud, and there is no out-of-the-box functionality for working with REST APIs. A stable product, but definitely not the future.
GSheet data can be linked to a BigQuery table, and the data in that sheet is read into BigQuery in real time. It's a live 'sync', which means it supports insertions, deletions, and alterations. The only limitation here is the schema, which remains static once the table is created.
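For reference, a rough sketch of how such a Sheets-backed table can be defined with the Python client; the project, dataset, table name, and spreadsheet URL are all placeholders:

```python
# Sketch: creating a BigQuery external table backed by a Google Sheet.
# Project, dataset, table name, and spreadsheet URL are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = [
    "https://docs.google.com/spreadsheets/d/SPREADSHEET_ID"
]
external_config.autodetect = True  # schema is inferred once, then stays static

table = bigquery.Table("my-project.my_dataset.sheet_backed_table")
table.external_data_configuration = external_config
client.create_table(table)  # queries against this table read the sheet live
```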
Seamless integration with other GCP products.
A simple pipeline might look like this:
GForms -> GSheets -> BigQuery -> Looker
It all links up easily and works well.
One instance holds many projects.
Separating data into data marts or data meshes is really easy in BigQuery, since one BigQuery instance can hold multiple projects, which are isolated collections of datasets.
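As a sketch of what that isolation looks like in practice, tables in different projects can be queried side by side simply by fully qualifying their names; all project, dataset, and table names below are made up:

```python
# Sketch: joining tables that live in two separate BigQuery projects.
# All project, dataset, and table names are made up.
from google.cloud import bigquery

client = bigquery.Client(project="reporting-project")

query = """
    SELECT s.customer_id, s.order_total, m.segment
    FROM `sales-mart-project.sales.orders` AS s
    JOIN `marketing-mart-project.crm.segments` AS m
      USING (customer_id)
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_total, row.segment)
```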
Please expand the availability of documentation, tutorials, and community forums to provide developers with comprehensive support and guidance on using Google BigQuery effectively for their projects.
If possible, simplify the pricing model and provide clearer cost breakdowns to help users understand and plan for expenses when using Google BigQuery. Some cost reduction would also be welcome.
The process of importing data into Google BigQuery still falls short. It could probably be improved by broadening compatibility with different data formats and sources and by reducing the complexity of data ingestion workflows.
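To illustrate what ingestion currently involves, here is a rough sketch of a CSV load job with the Python client; the bucket, project, dataset, and table names are hypothetical:

```python
# Sketch: loading a CSV file from Cloud Storage into a BigQuery table.
# Bucket, project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders.csv",
    "my-project.my_dataset.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
```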
Connection managers for online data sources can be tricky to configure.
Performance tuning is an art form, and trialing different data flow task options can be cumbersome. SSIS could do a better job of providing performance data, including historical data, for monitoring.
Mapping destinations using the OLE DB Command is difficult, as the destination columns are unnamed.
Excel or flat file connections are limited by version and type.
We have to use this product because our third-party supplier chose it for the data side of their backend, so it is unlikely we will move away from it in the future unless the supplier decides to change data vendors.
Some features should be revised or improved, and some toolbox tools (when using it with Visual Studio) should be less schematic and somewhat more flexible. For example, the CSV data import is still very old-fashioned, and if the data format changes it requires a bit of manual labor to accept the new data structure.
I think overall it is easy to use. I haven't done anything on the development side; I am more of an end user of the reporting tables built in Google BigQuery. I connect data visualization tools like Tableau or Power BI to the BigQuery reporting tables to analyze trends and create complex dashboards.
SSIS is a great tool for most ETL needs. It has 90% (or more) of use cases covered, and even in many of the use cases where it is not ideal, SSIS can be extended via a .NET language to do the job well, in a supportable way, for almost any performance workload.
I have never had any significant issues with Google BigQuery. It always seems to be up and running properly when I need it. I cannot recall any time I received any kind of application error or unplanned outage. If there were any, they were resolved quickly by my IT team, so I didn't notice them.
I think Google BigQuery's performance is in the acceptable range. Larger datasets are sometimes somewhat sluggish to load, but for most of our applications it performs at a reasonable speed. We do have some reports that include a lot of complex calculations, and others that run on granular store-level data, that sometimes take a bit longer to load, which can be frustrating.
SQL Server Integration Services performance depends directly on the resources provided to the system. In our environment, we allocated 6 nodes of 4 CPUs and 64 GB each, running in parallel. Unfortunately, we had to ramp up to such a robust environment to get the performance to where we needed it. Most of the reports complete in a reasonable timeframe. However, in the case of slow-running reports, it is often difficult, if not impossible, to cancel the report without killing the report instance or stopping the service.
BigQuery can be difficult to support precisely because it is so solid as a product. Most of the issues you will see are related to your own data sets; however, you may also see issues importing data and managing jobs. If this occurs, it can be a challenge to get to speak to the right person who can help you.
The support, when necessary, is excellent. Beyond that, it is very rarely necessary, because the user community is so large, vibrant, and knowledgeable that a simple Google query or forum question can answer almost anything you want to know. You can also get prewritten script tasks with a variety of functionality, which saves a lot of time.
The implementation may be different in each case. It is important to properly analyze all of the existing infrastructure to understand the kind of work needed, the type of software used and the compatibility between them, and the features you want to exploit, so as to understand what is possible and what requires integration with third-party tools.
Power BI can connect to GA4, for example, but the data processing is more complicated and it takes longer to create dashboards. Azure is great once the data import has been configured, but that configuration is not as easy a task for small businesses as it is with BigQuery.
I had nothing to do with the choice or the install. I assume it was made because it's easy to integrate with our SQL Server environment and free. I'm not sure of any other enterprise-level solution that would solve this problem, but I would likely have approached it with traditional scripting. Comparably free, but my own familiarity with traditional scripts would be the final deciding factor. Perhaps with some further training on SSIS I would have a different answer.
We have continued to expand our use of Google BigQuery over the years. I'd say its flexibility and scalability are actually quite good. It also integrates well with other tools like Tableau and Power BI. It has served the needs of multiple data sources across multiple departments within my company.
Google Support has kindly provided individual support and consultants to assist with the integration work. When the consultants are not available to support the work, the Google Support helpline is always there to answer queries, without our having to wait more than three days.
Previously, running complex queries on our on-premises data warehouse could take hours. Google BigQuery processes the same queries in minutes. We estimate it saves our team at least 25% of their time.
We can target our marketing campaigns very easily and understand our customer behaviour. It lets us personalize marketing campaigns and product recommendations, and we have seen at least a 20% improvement in overall campaign performance.
Now we only pay for the resources we use. We have saved $1 million annually on data infrastructure and data storage costs compared to our previous solution.