Get a firsthand look at Databricks from the practitioner’s perspective with these simple on-demand videos. Each of the demos below is paired with related materials — including notebooks, videos and eBooks — so that you can try it out for yourself on Databricks.
In this demo, we walk through a high-level overview of the Databricks Lakehouse platform, including a discussion of how open source projects such as Apache Spark™, Delta Lake, MLflow, and Koalas fit into the Databricks ecosystem. We then cover the Data Science Workspace, launching Spark clusters, and collaborative notebook features, before shifting our focus to Delta Lake, time travel, and Databricks SQL.
In this demo, we walk through some of the features of the new Databricks SQL that are important to data analysts, including the integrated data browser, SQL query editor with live autocomplete, built-in data visualization tools, and flexible dashboarding and alerting capabilities. We also cover how Databricks SQL endpoints provide a high-performance, low-latency, SQL-optimized compute resource that can power your existing BI tools like Power BI and Tableau.
In this demo, we walk through a real-world data science and machine learning use case on Databricks, showing how different members of the data team can interact and collaborate on the Databricks platform. We also show how MLflow on Databricks simplifies and streamlines the end-to-end machine learning life cycle.
With Delta Lake on Databricks, you can build a lakehouse architecture that combines the best parts of data lakes and data warehouses. This simple, open platform both stores and manages all of your data and supports all of your analytics and AI use cases. In this demo, we cover the main features of Delta Lake, including unified batch and streaming data processing, schema enforcement and evolution, time travel, and support for UPDATEs/MERGEs/DELETEs, and we also touch on some of the performance enhancements available with Delta Lake on Databricks.
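The MERGE (upsert) semantics mentioned above can be sketched in plain Python. This is a minimal, hypothetical stand-in that uses a dict keyed by `id` in place of a Delta table; Delta Lake itself applies the same logic transactionally across data files at scale:

```python
# Minimal sketch of MERGE (upsert) semantics, using a dict as a
# hypothetical in-memory stand-in for a Delta table keyed by "id".
# Delta Lake performs equivalent logic transactionally on data files.

def merge(target: dict, updates: list) -> dict:
    """WHEN MATCHED THEN UPDATE, WHEN NOT MATCHED THEN INSERT."""
    for row in updates:
        key = row["id"]
        if key in target:
            target[key].update(row)    # matched: update in place
        else:
            target[key] = dict(row)    # not matched: insert
    return target

table = {1: {"id": 1, "name": "alice", "visits": 3}}
merge(table, [
    {"id": 1, "visits": 4},                # updates an existing row
    {"id": 2, "name": "bob", "visits": 1}  # inserts a new row
])
```

In Delta Lake this same conditional update/insert is a single atomic `MERGE INTO` statement, which is what makes UPDATEs and DELETEs safe on an open-format data lake.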
In this demo, we give you a first look at Delta Live Tables, a cloud service that makes reliable ETL (extract, transform and load) on Delta Lake easy. It helps data engineering teams simplify ETL development with a simple UI and declarative tooling, improve data reliability through defined data quality rules and bad-data monitoring, and scale operations with deep visibility through an event log.
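The declarative pattern behind Delta Live Tables — a table is a decorated function, and quality rules attach to it — can be sketched with a tiny mock framework. The decorator below echoes, but is not, the real `dlt.expect_or_drop` API; everything here is illustrative:

```python
# Hypothetical mini-framework sketching Delta Live Tables' declarative
# style: a data quality expectation drops rows failing a predicate and
# records how many were dropped (DLT surfaces this via its event log).

dropped = {}  # event-log stand-in: expectation name -> bad-row count

def expect_or_drop(name, predicate):
    def decorator(fn):
        def wrapper():
            rows = fn()
            good = [r for r in rows if predicate(r)]
            dropped[name] = len(rows) - len(good)  # monitoring signal
            return good
        return wrapper
    return decorator

@expect_or_drop("valid_amount", lambda r: r["amount"] > 0)
def orders():
    # In DLT this function body would be a Spark read/transform.
    return [{"id": 1, "amount": 25.0}, {"id": 2, "amount": -5.0}]

clean = orders()  # bad row is dropped, and the drop is counted
```

The point of the declarative form is that the quality rule lives next to the table definition, so monitoring and enforcement come for free rather than being hand-written into each pipeline.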
With Databricks Auto Loader, you can incrementally and efficiently ingest new batch and real-time streaming data files into your Delta Lake tables as soon as they arrive — so that they always contain the most complete and up-to-date data available. SQL users can use the simple “COPY INTO” command to pull new data into their Delta Lake tables automatically, without the need to keep track of which files have already been processed.
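The bookkeeping that Auto Loader and `COPY INTO` handle for you — ingesting each file exactly once, so reruns never duplicate data — can be sketched as follows. The file paths and the in-memory set are illustrative; the real services track this state durably in the cloud:

```python
# Sketch of the exactly-once file tracking Auto Loader / COPY INTO
# perform internally: new files are ingested, already-seen files are
# skipped on rerun. (Illustrative only; real state is stored durably.)

def ingest_new_files(available, processed):
    new = [f for f in available if f not in processed]
    processed.update(new)  # remember what has been loaded
    return new             # only these rows-worth of files get appended

seen = set()
first = ingest_new_files(["2024/01.json", "2024/02.json"], seen)
rerun = ingest_new_files(["2024/01.json", "2024/02.json", "2024/03.json"], seen)
# rerun picks up only the newly arrived file
```

This is why users never need to keep track of which files have already been processed: the ingestion command is idempotent by construction.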
The Azure Databricks Lakehouse platform gives you the best of data lakes and data warehouses, on a simple, open, and collaborative platform that securely integrates with your existing Azure services. In this demo, we cover several of the most common Azure Databricks integrations, including Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), Azure IoT Hub, Azure Synapse Analytics, Power BI and more.
Databricks runs on AWS and integrates with all of the major services you use like S3, EC2, Redshift, and more. We provide the platform that enables you to combine all of these services to build a lakehouse architecture. In this demo, we’ll show you how Databricks integrates with each of these services simply and seamlessly.
In this solution accelerator, we demonstrate how to use the Databricks Lakehouse Platform to better understand and quantify the holistic ESG impact of any investment in a company or business in order to generate alpha, mitigate reputational risk, and maintain the trust of both clients and shareholders.
In this solution accelerator, we demonstrate how to use Apache Spark™ and Facebook Prophet™ to build dozens of time series forecasting models in parallel on the Databricks Lakehouse Platform.
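The fit-one-model-per-group pattern this accelerator uses (on Databricks, typically Spark's grouped pandas UDFs fitting one Prophet model per store/item combination in parallel) can be sketched in pure Python. Here a thread pool stands in for Spark, and a naive mean "model" stands in for Prophet — both are placeholders:

```python
# Sketch of the train-many-models-in-parallel pattern: one independent
# forecasting model per group. A thread pool stands in for Spark's
# distributed execution, and a naive mean stands in for a Prophet fit.

from concurrent.futures import ThreadPoolExecutor

def forecast(group):
    store, series = group
    mean = sum(series) / len(series)  # placeholder for a Prophet fit
    return store, mean                # "forecast" the next value

sales = {"store_a": [10, 12, 14], "store_b": [100, 90, 110]}

with ThreadPoolExecutor() as pool:
    results = dict(pool.map(forecast, sales.items()))
```

Because each group's model is independent, the work is embarrassingly parallel — which is what lets Spark scale this from two groups to dozens (or thousands) of store/item time series.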
Identify your most valuable and longest lifetime customers. Find out where to prioritize resources and where to limit spending on unprofitable customers — to help improve the ROI of marketing programs.
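A back-of-envelope version of the ranking described here can be written in a few lines. This uses a deliberately simplified lifetime value formula (average order value × orders per year × expected years); the customer names and numbers are illustrative, and real CLV models are probabilistic:

```python
# Illustrative customer lifetime value: avg order value x orders/year
# x expected years as a customer. A simplified ranking heuristic; real
# CLV modeling is probabilistic, not a single multiplication.

def clv(avg_order_value, orders_per_year, years):
    return avg_order_value * orders_per_year * years

customers = {
    "c1": clv(50.0, 12, 3),   # frequent, mid-value
    "c2": clv(500.0, 1, 5),   # infrequent, high-value
    "c3": clv(20.0, 2, 1),    # low engagement
}

# Highest projected value first: prioritize resources at the top,
# limit spending at the bottom.
ranked = sorted(customers, key=customers.get, reverse=True)
```

Ranking customers this way is what turns the lifetime-value estimate into an actionable budget decision for marketing programs.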