Announcing CARTO’s Spatial Extension for Databricks — Powering Geospatial Analysis for JLL
This is a collaborative post by Databricks and CARTO. We thank Javier de la Torre, Founder and Chief Strategy Officer at CARTO, for his contributions. Today, CARTO is announcing the beta launch of its new product, the Spatial Extension for Databricks, which provides a simple installation and seamless integration with the Databricks Lakehouse...
Introduction to Databricks and PySpark for SAS Developers
This is a collaborative post between Databricks and WiseWithData. We thank Ian J. Ghent, Founder and President; Bryan Chuinkam, Head of Pre-Sales Solutions R&D; and Ban (Mike) Sun, Head of Migration Solutions R&D, of WiseWithData for their contributions. Technology has come a long way since the days of SAS®-driven data and analytics workloads. The...
Deploying dbt on Databricks Just Got Even Simpler
At Databricks, nothing makes us happier than making our users more productive, which is why we are delighted to announce a native adapter for dbt. It’s now easier than ever to develop robust data pipelines on Databricks using SQL. dbt is a popular open source tool that lets a new breed of ‘analytics engineer’ build...
Building Analytics on the Lakehouse Using Tableau With Databricks Partner Connect
This is a guest authored post by Madeleine Corneli, Sr. Product Manager, Tableau. On November 18, Databricks announced Partner Connect, an ecosystem of pre-integrated partners that allows customers to discover and connect data, analytics, and AI tools to their lakehouse. Tableau is excited to be among a set of launch partners to be featured in...
Build Your Business on Databricks With Partner Connect
At Databricks, we believe that to create the ultimate customer experience, we must leverage the work of more than just our employees and create a platform others can extend. To see the importance of this, think of the apps on your phone. Were they all made by Apple or Google? How much less valuable would...
Now Generally Available: Introducing Databricks Partner Connect to Discover and Connect Popular Data and AI Tools to the Lakehouse
Databricks is thrilled to announce Partner Connect, a one-stop portal for customers to quickly discover a broad set of validated data, analytics, and AI tools and easily integrate them with their Databricks lakehouse across multiple cloud providers. Partner Connect makes it easy for customers to integrate validated data, analytics, and AI tools directly within their...
Accenture and Databricks Lakehouse Accelerate Digital Transformation
This is a collaborative post from Accenture and Databricks. We thank Matt Arellano, Managing Director and Global Data & AI Ecosystem Lead at Accenture, for his contributions. To keep pace with the competition and address customer demands, companies are looking to quickly bring new capabilities to market, boost innovation, and scale more efficiently. Customers...
Curating More Inclusive and Safer Online Communities With Databricks and Labelbox
This is a guest authored post by JT Vega, Support Engineering Manager, Labelbox. While video games and digital content are a source of entertainment, connection, and fun for many around the world, they are also frequently a destination for toxic behavior that can include flaming, trolling, cyberbullying, and hate speech in the form...
5 Steps to Get Started With Databricks on Google Cloud
Since we launched Databricks on Google Cloud earlier this year, we’ve been thrilled to see stories about the value this joint solution has brought to data teams across the globe. One of our favorite quotes is from Douglas Mettenburg, Vice President Analytics at J. B. Hunt: “Ultimately, Databricks on Google Cloud is now the source...
Introducing Support for gp3, Amazon’s New General Purpose SSD Volume
Databricks clusters on AWS now support gp3 volumes, the latest generation of Amazon Elastic Block Store (EBS) general purpose SSDs. gp3 volumes offer consistent performance, cost savings, and the ability to configure the volume's IOPS, throughput, and size independently. Databricks on AWS customers can now easily switch to gp3 for the better price/performance storage...