Hi, I’m Denis Gontcharov.
I work as a Business Development Representative at Databricks in Amsterdam. I write a newsletter about transitioning from engineering into tech sales at Databricks.
In this episode, Denis sits down with Jim Gavigan, founder of Industrial Insight, to discuss Datatude, a framework for measuring your organization’s readiness to leverage industrial data effectively.

About the Guest: Jim Gavigan brings 30 years of experience in industrial manufacturing, from vibration analysis and control systems to working at Rockwell Automation and OSIsoft. He founded Industrial Insight in 2016 to help companies maximize the value of their time series data. ...
This time I had the pleasure of inviting Serena Delli to discuss how she deployed AI Agents at Bludigit - Italgas to help both IT and business people with troubleshooting and resolving operational tickets. Is your team getting buried under a pile of ServiceNow tickets with fuzzy descriptions and unclear objectives? Find out how you can automate some of that mind-numbing work!

My favorite notes:

- Sometimes the agent can solve the problem so that no ServiceNow ticket needs to be created at all.
- A general agent (the “general physician”) forwards tough requests to a specialized agent (the “cardiologist”).
- Hosting everything on Databricks allowed Italgas to double the rate of projects they deliver while reducing their cloud bill.
Video

Objectives

In this post we will deploy a Databricks Asset Bundle (DAB) from a Git repository hosted on Azure DevOps using Azure DevOps pipelines. In summary, we will learn how to:

- Grant Databricks access to your Azure DevOps Git repository.
- Define a simple DAB that deploys a Databricks notebook.
- Use the Databricks CLI to validate and deploy DABs.
- Write an Azure DevOps pipeline to deploy this DAB.
- Pass parameters from the DAB into the Databricks notebook.

Concerning the last point, it’s not uncommon for your code to differ slightly between Databricks environments (dev, test, prod). For example, you may have an Azure key vault my_key_vault_dev for the development workspace and my_key_vault_prod for the production workspace. We will see how to pass this workspace-dependent data from the DAB to Databricks notebooks via widgets, as sketched below. ...
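To make that last bullet concrete, here is a minimal sketch of such a bundle. It is not the exact code from the video: the bundle name, job and task keys, notebook path, workspace hosts, and key vault names are all assumptions.

```yaml
# databricks.yml -- minimal sketch; all names below are illustrative.
bundle:
  name: my_bundle

variables:
  key_vault_name:
    description: Azure key vault to use in this workspace
    default: my_key_vault_dev

targets:
  dev:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  prod:
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net
    variables:
      key_vault_name: my_key_vault_prod  # per-target override

resources:
  jobs:
    my_job:
      name: my_job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/my_notebook.ipynb
            # Exposed to the notebook as a widget named "key_vault_name",
            # readable with dbutils.widgets.get("key_vault_name").
            base_parameters:
              key_vault_name: ${var.key_vault_name}
```

Validation and deployment then reduce to `databricks bundle validate` and `databricks bundle deploy -t prod`, two commands an Azure DevOps pipeline can run as plain script steps.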
In this episode of the Watts in Your Data Podcast, I talk with John Walmsley of Aluminate Technologies about what AI actually does in heavy industry today, cutting through the hype to explore real applications and challenges. John brings experience spanning semiconductors, medical devices, and AI in heavy industry. The conversation covers three levels of industrial AI: continuous monitoring, multi-sensor analysis, and autonomous optimization. Using aluminum industry examples, we explore why AI projects get stuck in the pilot phase and what it takes to scale solutions enterprise-wide. ...
In this new video I share how to overcome Azure CPU quota limits with Databricks Asset Bundles, a common roadblock many Databricks practitioners face when deploying them on Azure for the first time.

Problem

If you’re playing around with Databricks projects, Azure’s default CPU quota limits often fall short of what Databricks Asset Bundle Python template jobs and pipelines actually need to run. ...
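As a spoiler-free sketch of one common mitigation (not necessarily the exact fix shown in the video; the job key, Spark version, and VM SKU below are assumptions, and available SKUs depend on your subscription): shrink the job cluster that the template requests, for instance to a single-node cluster on a small VM family for which you still have free vCPU quota.

```yaml
# databricks.yml excerpt -- illustrative only.
resources:
  jobs:
    my_project_job:
      job_clusters:
        - job_cluster_key: job_cluster
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: Standard_F4s   # 4 vCPUs; pick a family with free quota
            num_workers: 0               # driver only, no worker nodes
            # Required settings for a single-node cluster:
            spark_conf:
              spark.databricks.cluster.profile: singleNode
              spark.master: local[*]
            custom_tags:
              ResourceClass: SingleNode
```

Alternatively, you can request a vCPU quota increase for the relevant VM family in the Azure portal.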