Author: Ashish Singh Chauhan
Prerequisites:
- You must have all the credentials provided by your client, since the Databricks workspace resides at the client's end and you won't be able to access it directly.
- Your public IP address must be allowlisted by the client for access to the Databricks database. This lets you test your application on a local machine.
Steps to connect:
- Add a listener to trigger your flow and configure it.
- Add the Database module to your Mule palette:
Mule Palette → Add Modules → Database, then drag it into the palette.
- Save the connection details, i.e. host, port, SQL HTTP path, and personal access token.
- To make the connection, download the databricks-jdbc JAR and POM files. For that:
- Go to the link – https://search.maven.org/artifact/com.databricks/databricks-jdbc/2.6.25-1/jar
- In the Downloads section at the top-right corner, download both the jar and the pom from there.
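Alternatively, if your Mule project builds with Maven, the same driver can be declared as a dependency in pom.xml instead of being downloaded manually. This is a sketch, using the version from the Maven Central link above:

```xml
<!-- databricks-jdbc dependency, matching version 2.6.25-1 from the link above -->
<dependency>
    <groupId>com.databricks</groupId>
    <artifactId>databricks-jdbc</artifactId>
    <version>2.6.25-1</version>
</dependency>
```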

- Go to Global Elements → Create → Connector configuration → Database Config → select the Generic connection.
- While configuring the JDBC driver, choose the local-file option and, in the File and POM fields, add the paths to the JAR and POM you downloaded.
- Configure Url as follows →
jdbc:databricks://<host>:<port>/default;transportMode=http;ssl=1;AuthMech=3;httpPath=<http_path>;UID=token;PWD=<personal_access_token>;
Replace <host>, <port>, <http_path>, and <personal_access_token> with the credentials you saved in step 3.
Driver class name – com.databricks.client.jdbc.Driver
Username and password – not required, since you are using a personal access token.
After configuring, test the connection.
Note – i) Here the jdbc:databricks subprotocol is used for integration with MuleSoft. Older driver versions use jdbc:spark instead.
ii) Make sure your VPN is active before making the connection.
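Put together, the resulting global element in the Mule configuration XML looks roughly like this. This is a sketch: the config name and the ${databricks.*} property placeholders are illustrative, not values from this article.

```xml
<!-- Generic JDBC connection to Databricks; the driver JAR/POM paths are set via
     the local-file option in the UI, and the property names here are illustrative -->
<db:config name="Database_Config">
    <db:generic-connection
        url="jdbc:databricks://${databricks.host}:${databricks.port}/default;transportMode=http;ssl=1;AuthMech=3;httpPath=${databricks.httpPath};UID=token;PWD=${databricks.token};"
        driverClassName="com.databricks.client.jdbc.Driver" />
</db:config>
```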


- Add a Select operation to the Mule flow and write your query in it.

- The resulting Mule flow will look like this:
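The steps above can be sketched as a minimal flow in the Mule configuration XML. This is illustrative only: the flow name, listener path, table name, and config references are assumed names, not values from this article.

```xml
<flow name="databricks-select-flow">
    <!-- HTTP listener triggers the flow; the path is an illustrative choice -->
    <http:listener config-ref="HTTP_Listener_config" path="/databricks" />
    <!-- Select operation runs the query against the Databricks config above -->
    <db:select config-ref="Database_Config">
        <db:sql>SELECT * FROM default.my_table</db:sql>
    </db:select>
    <!-- Serialize the result rows to JSON for the HTTP response -->
    <ee:transform>
        <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload]]></ee:set-payload>
        </ee:message>
    </ee:transform>
</flow>
```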

- Run your Mule application and hit the endpoint. You will see the data in Postman.
