Connecting Databricks SQL Database To MuleSoft

Author: Ashish Singh Chauhan

Prerequisites:
  1. You must have all the credentials provided by your client, since the Databricks workspace resides at the client's end and you cannot access it directly.
  2. Your public IP address must be allowlisted by the client so you can reach the Databricks database. This lets you test your application from a local machine.
Steps of connection:
  1. Add a Listener to trigger your flow and configure it.
  2. Add the Database module to your Mule palette.

Mule Palette → Add Modules → Database module, then drag it into the palette.

  3. Save the connection details, i.e. Host, Port, SQL HTTP Path, and Personal Access Token.
  4. To make the connection, download the databricks-jdbc JAR file and its POM file.
  5. Go to Global Elements → Create → Connector Configuration → Database Config → select the Generic Connection.
  6. When configuring the JDBC driver, choose the local-file option and, in the File and POM fields, add the paths to the downloaded JAR and POM.
  7. Configure the URL as follows:
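A typical Databricks JDBC URL has the following shape. This is a sketch based on the standard Databricks JDBC driver conventions, not a value taken from this environment:

```
jdbc:databricks://<host>:<port>/default;transportMode=http;ssl=1;httpPath=<http_path>;AuthMech=3;UID=token;PWD=<personal_access_token>
```

Here AuthMech=3 with UID=token tells the driver to authenticate with a personal access token, and <port> is usually 443.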

Replace <host>, <port>, <http_path>, and <personal_access_token> with the credentials saved in step 3.
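The JAR and POM mentioned in step 4 are published on Maven Central under the com.databricks group. A minimal dependency sketch (the version shown is an assumption; use the latest available release):

```xml
<!-- databricks-jdbc on Maven Central; version is an assumption -->
<dependency>
    <groupId>com.databricks</groupId>
    <artifactId>databricks-jdbc</artifactId>
    <version>2.6.36</version>
</dependency>
```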

Driver class name – com.databricks.client.jdbc.Driver

Username and Password – Not required since you are using a personal access token.

After configuring, test the connection.

Note – i) The jdbc:databricks URL prefix is used here for integration with MuleSoft; the older Simba Spark driver uses jdbc:spark instead.

 ii) Make sure your VPN is active before making the connection.

  8. Add a Select component to the Mule flow and write your query in it.
  9. The complete Mule flow is simply: Listener → Select.
  10. Run your Mule application and hit the endpoint. You will see the data in Postman.
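The flow described in the steps above can be sketched as Mule 4 XML. The flow name, listener path, config-ref names, and the sample query are all assumptions for illustration:

```xml
<flow name="databricks-select-flow">
    <!-- HTTP Listener that triggers the flow (step 1); names are placeholders -->
    <http:listener config-ref="HTTP_Listener_config" path="/databricks"/>
    <!-- Select component (step 8) pointing at the Generic Connection config -->
    <db:select config-ref="Database_Config">
        <db:sql>SELECT * FROM my_schema.my_table LIMIT 10</db:sql>
    </db:select>
    <!-- Transform the result set to JSON so it renders cleanly in Postman -->
    <ee:transform>
        <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload]]></ee:set-payload>
        </ee:message>
    </ee:transform>
</flow>
```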
