  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · First, install the Databricks Python SDK (pip install databricks-sdk) and configure authentication per the docs here. Then you can use the approach below to print out secret …
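
    A minimal sketch of that approach, assuming the databricks-sdk package is installed and authentication is already configured (e.g. via DATABRICKS_HOST and DATABRICKS_TOKEN); the scope and key names are hypothetical:

        import base64

        from databricks.sdk import WorkspaceClient

        # Assumes auth is configured per the Databricks SDK docs.
        w = WorkspaceClient()

        # The Secrets API returns the secret value base64-encoded, so decode
        # it before printing ("my-scope"/"my-key" are placeholder names).
        resp = w.secrets.get_secret(scope="my-scope", key="my-key")
        print(base64.b64decode(resp.value).decode("utf-8"))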

  2. Is there a way to use parameters in Databricks in SQL with …

    Sep 29, 2024 · The catalog and schema should change dynamically for the DEV, UAT, and PROD environments. There is a lot of confusion with respect to the use of parameters in SQL, but I see …
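
    One hedged sketch of parameterized SQL for this, assuming Spark 3.4+ (for the args keyword on spark.sql) and a runtime that supports the IDENTIFIER clause; all catalog/schema/table names are hypothetical:

        # Build the fully qualified name per environment (DEV/UAT/PROD).
        env = "dev"  # e.g. taken from a job parameter or widget
        fqn = f"analytics_{env}.sales.orders"  # hypothetical catalog.schema.table

        # IDENTIFIER(:tbl) resolves the named parameter marker as an
        # identifier, so no SQL string concatenation is needed.
        df = spark.sql("SELECT * FROM IDENTIFIER(:tbl) LIMIT 10",
                       args={"tbl": fqn})
        df.show()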

  3. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage …
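
    A minimal sketch contrasting the two table types (a SparkSession named spark is assumed; the table names and the abfss:// path are placeholders):

        # Managed table: Databricks controls both metadata and data files;
        # DROP TABLE deletes the underlying data.
        spark.sql("CREATE TABLE main.demo.managed_sales AS SELECT * FROM source_view")

        # External table: Databricks manages only the metadata; the data
        # stays at the external location and survives DROP TABLE.
        spark.sql("""
            CREATE TABLE main.demo.external_sales
            LOCATION 'abfss://data@mystorage.dfs.core.windows.net/sales'
            AS SELECT * FROM source_view
        """)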

  4. Databricks shared access mode limitations - Stack Overflow

    Oct 2, 2023 · You're correct about the listed limitations. But when you're using Unity Catalog, especially with shared clusters, you need to think a bit differently than before. UC + shared …

  5. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done …
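
    For the REST side, a hedged sketch against the SQL Statement Execution API (POST /api/2.0/sql/statements/); the warehouse ID and table name are placeholders, and DATABRICKS_HOST / DATABRICKS_TOKEN are assumed to be set in the environment:

        import os

        import requests

        host = os.environ["DATABRICKS_HOST"]
        token = os.environ["DATABRICKS_TOKEN"]

        # Run a query on a SQL warehouse and wait up to 30s for the result.
        resp = requests.post(
            f"{host}/api/2.0/sql/statements/",
            headers={"Authorization": f"Bearer {token}"},
            json={
                "warehouse_id": "<warehouse-id>",  # placeholder
                "statement": "SELECT * FROM main.gold.daily_metrics LIMIT 100",
                "wait_timeout": "30s",
            },
        )
        resp.raise_for_status()
        print(resp.json().get("result"))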

  6. Databricks Permissions Required to Create a Cluster

    Nov 9, 2023 · In Azure Databricks, if you want to create a cluster, you need to have the "Can Manage" permission. This permission basically lets you handle everything related to clusters, …
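
    A sketch of granting that permission programmatically via the SDK's Permissions API (the cluster ID and user are placeholders, and the caller is assumed to already be allowed to manage the cluster's permissions):

        from databricks.sdk import WorkspaceClient
        from databricks.sdk.service.iam import (AccessControlRequest,
                                                PermissionLevel)

        w = WorkspaceClient()

        # Add CAN_MANAGE on one cluster for one user, leaving other
        # access-control entries in place.
        w.permissions.update(
            request_object_type="clusters",
            request_object_id="<cluster-id>",  # placeholder
            access_control_list=[
                AccessControlRequest(
                    user_name="someone@example.com",  # placeholder
                    permission_level=PermissionLevel.CAN_MANAGE,
                )
            ],
        )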

  7. Can't authenticate deploy of Databricks bundle in Azure pipeline …

    Nov 3, 2023 · Issue: Trying to deploy a Databricks bundle within an Azure pipeline. Databricks CLI = v0.209.0. The bundle artifact is downloaded to the VM correctly. Conducted via these instructions: …
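
    One hedged sketch of non-interactive service-principal auth for the CLI in a pipeline step; the host and IDs are placeholders that would normally come from pipeline secrets, and the azure-client-secret auth method via ARM_* variables is assumed to be supported by the CLI version in use:

        import os
        import subprocess

        # Environment for the Databricks CLI's unified authentication.
        env = {
            **os.environ,
            "DATABRICKS_HOST": "https://adb-1234567890.12.azuredatabricks.net",
            "ARM_TENANT_ID": "<tenant-id>",          # placeholder
            "ARM_CLIENT_ID": "<sp-application-id>",  # placeholder
            "ARM_CLIENT_SECRET": "<sp-secret>",      # placeholder
        }

        # Deploy the bundle to the "dev" target without an interactive login.
        subprocess.run(["databricks", "bundle", "deploy", "-t", "dev"],
                       env=env, check=True)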

  8. how to get databricks job id at the run time - Stack Overflow

    Jun 9, 2025 · I am trying to get the job ID and run ID of a Databricks job dynamically and keep them in a table with the code below: run_id = self.spark.conf.get("spark.databricks.job.runId", "no_ru...
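
    A minimal sketch along the lines of the question's spark.conf.get approach; the spark.databricks.job.* conf keys and the jobId/runId context tags are only populated when the code runs as a job, hence the fallbacks (spark and dbutils are assumed to exist on Databricks):

        import json

        run_id = spark.conf.get("spark.databricks.job.runId", "no_run_id")

        # Alternative: read jobId/runId from the notebook context tags.
        ctx = json.loads(
            dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
        )
        tags = ctx.get("tags", {})
        print(tags.get("jobId"), tags.get("runId"), run_id)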

  9. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath …
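
    The commonly cited Python equivalent, sketched here on the assumption that dbutils is available (i.e. the code runs in a Databricks notebook):

        # Pull the notebook path out of the notebook context.
        path = (
            dbutils.notebook.entry_point.getDbutils()
            .notebook()
            .getContext()
            .notebookPath()
            .get()
        )
        print(path)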

  10. How to trigger a Databricks job from another Databricks job?

    Aug 14, 2023 · Databricks is now rolling out new functionality, called "Job as a Task", that allows you to trigger another job as a task in a workflow. Documentation isn't updated yet, but you …
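
    Besides "Job as a Task", a hedged sketch of triggering one job from another through the Jobs API in the Python SDK (the job ID is a placeholder and authentication is assumed to be configured):

        from databricks.sdk import WorkspaceClient

        w = WorkspaceClient()

        # Kick off the downstream job by ID; result() optionally blocks
        # until that run reaches a terminal state.
        waiter = w.jobs.run_now(job_id=123456789)  # placeholder job ID
        run = waiter.result()
        print(run.run_id, run.state.result_state)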