Databricks in San Francisco offers the Databricks Lakehouse Platform (formerly the Unified Analytics Platform), a data science platform and Apache Spark cluster manager. The Databricks Unified Data Service aims to provide a reliable and scalable platform for data pipelines, data lakes, and data platforms. Users can manage the full data journey: ingesting, processing, storing, and exposing data throughout an organization. Its Data Science Workspace is a collaborative environment for practitioners to run…
$0.07 per DBU
Looker
Score 8.5 out of 10
N/A
Looker is a BI application with an analytics-oriented application server that sits on top of relational data stores. It includes an end-user interface for exploring data, a reusable development paradigm for data discovery, and an API for supporting data in other systems.
Medium-to-large data throughput shops will benefit the most from Databricks Spark processing. Smaller shops may find the barrier to entry a bit too high for casual use cases. The overhead of kicking off a Spark compute job can actually lead to workloads taking longer, but past a certain point the performance returns cannot be beat.
As a salesperson, it suited me to use such well-developed software, with a nice dashboard that let me navigate between my prospects, visualize client numbers, and check my achievements for each quarter. That helped me understand my performance and record the data I needed for myself.
Shows visited pages (sessions, pageviews) and which programs are viewed the most.
Displays session source/medium views to see where users are coming from.
It shows the video titles, URLs, and event counts so we can monitor the performance of our videos.
It gives a graphic face to the numbers, such as using bar charts, pie graphs, and other charts to show user trends or which channels are driving engagement.
Our clients like to see the top pages visited for a month.
I like the drag-and-drop approach, and building charts is a little easier than it was before.
Connect my local code in Visual Studio Code to my Databricks Lakehouse Platform cluster so I can run the code on the cluster. The old databricks-connect approach has many bugs and is hard to set up. The new Databricks Lakehouse Platform extension for Visual Studio Code doesn't allow developers to debug their code line by line (we can only run the code).
Maybe have a dedicated Databricks Lakehouse Platform IDE that users could use to develop locally.
Visualization in MLflow experiments could be enhanced.
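The databricks-connect setup this reviewer mentions can be sketched with the newer Databricks Connect client. This is a minimal, hedged example, not the reviewer's exact setup: the workspace URL, token, and cluster ID are placeholders, and the exact builder options may vary by client version.

```python
# Minimal sketch of running local code against a remote Databricks cluster with
# the newer Databricks Connect client (databricks-connect >= 13).
# Workspace URL, token, and cluster ID are placeholders.
from databricks.connect import DatabricksSession

spark = (
    DatabricksSession.builder.remote(
        host="https://<your-workspace>.cloud.databricks.com",  # placeholder workspace URL
        token="<personal-access-token>",                        # placeholder personal access token
        cluster_id="<cluster-id>",                              # placeholder target cluster
    ).getOrCreate()
)

# Spark code written locally now executes on the remote cluster.
df = spark.range(10).toDF("n")
print(df.count())
```

With this client the code still executes remotely, which matches the reviewer's point that the VS Code extension can run code on the cluster but does not support stepping through it line by line.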
I give it this rating because it is effective; I am able to complete the majority of my tasks using this app. It is very helpful when analyzing the data provided and shown in the app, and it's overall a great app for operational use, despite the small hiccups it has (live data).
Because it is an amazing platform for designing experiments and delivering deep-dive analyses that require executing highly complex queries. It also lets you share information and insights across the company via shared workspaces, while keeping them secure.
In terms of graph generation and interaction, it could improve its UI and UX.
Looker is relatively easy to use once it is set up. Front-end customers only have issues with the initial setup of LookML creations. Other "Looks" are relatively easy to set up, depending on the ETL and the data coming into Looker on a regular basis.
Somewhat resource-heavy, both on the server and the client. I recommend at least a 50 Mbps data rate and a high-performance desktop computer to be able to run complex tasks and configure larger amounts of data. On the other hand, viewers don't need to worry; when just viewing, the performance is usually OK.
Some of the best customer and technology support that I have ever experienced in my career. You pay for what you get, and you get the Rolls-Royce. It reminds me of the customer support of SAS in the 2000s, when the tools were reaching some limits and their engineers wanted to know more about what we were doing, long before "data science" was even a term. Databricks truly embraces the partnership with their customers and helps them with any given challenge.
We never had to work with support for issues. Any questions we had, they would respond to promptly and clearly. The one-time setup was easy by following the documentation. If a feature is not supported, they will add a feature request. In this case, LDAP support was requested over Okta. They are looking into it.
The most important factor differentiating Databricks Lakehouse Platform from these other platforms is its support for ACID transactions and the time travel feature. Also, native integration with managed MLflow is a plus. EMR, Cloudera, and Hortonworks are not as optimized when it comes to Spark job execution. Other platforms need to be self-managed, which is another huge hassle.
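As a concrete illustration of the time travel feature mentioned above, Delta Lake can read a table as it existed at an earlier version or timestamp. This is a minimal sketch; the table path, version number, and timestamp are placeholders.

```python
# Sketch of Delta Lake time travel: reading an earlier state of a table.
# The table path, version number, and timestamp are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the table as of a specific version ...
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/mnt/delta/events")

# ... or as of a timestamp.
jan = (
    spark.read.format("delta")
    .option("timestampAsOf", "2023-01-01")
    .load("/mnt/delta/events")
)

print(v0.count(), jan.count())
```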
Looker gives you options to integrate external APIs with great ease. Our data analytics team can easily use multiple data sources as input to the Looker dashboard, and everything is consolidated in one single dashboard. You also have an option for shared folders that can be accessed by multiple people. The reporting system is excellent and has a wide range of reporting options that can be implemented.
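The kind of API integration the reviewer describes is typically done through Looker's SDK. Below is a minimal sketch using the looker_sdk Python package; the Look ID is a placeholder, and credentials are assumed to come from a looker.ini file or the LOOKERSDK_* environment variables.

```python
# Minimal sketch of pulling a saved Look's results via the Looker API using the
# official looker_sdk package. Credentials are read from looker.ini or the
# LOOKERSDK_* environment variables; the Look ID below is a placeholder.
import looker_sdk

sdk = looker_sdk.init40()  # authenticate against the Looker 4.0 API

# Run a saved Look and fetch its results as JSON.
results = sdk.run_look(look_id="42", result_format="json")
print(results)
```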
Looker has a significant impact on our business's ROI objectives. As an advertising exchange, we have specific goals for daily requests and fill, and having premade Looks to monitor these is an integral piece of our operational capability.
To facilitate an efficient monthly billing cycle in our organization, Looker is essential to track estimated revenue and impression delivery by publisher. Without the Looks we have set up, we would spend considerably more time and effort segmenting revenue by vertical.
Looker's unique value proposition is making analytical tools more digestible to people without conventional analytical experience. Other competing tools like Tableau require considerably more training and context to successfully use, and the ability to easily plot different visualizations is one of its greatest selling points.