Authors: Ashish Singh Chauhan and Shubham Singh Panwar
What is ChatGPT?
ChatGPT is an artificial intelligence language model developed by OpenAI. It uses a deep learning architecture called a transformer to generate human-like responses to text-based input. ChatGPT can understand and generate natural language, letting it converse with users on a wide range of topics. It was trained on a vast amount of text from the internet, books, and other sources, which enables it to produce relevant and informative responses.
Prerequisites for Integration in Mule
- An active OpenAI account is required. If you do not have one, you can create an account here – https://chat.openai.com/chat
- You must have an API key, which is used for the integration.
For the API key:
Go to https://platform.openai.com/overview → Personal (top right corner) → View API keys → Create secret key → Copy.


- Download the OpenAI connector from GitHub – https://github.com/djuang1/openai-connector
- Extract the zip → open a command prompt in the same location → run mvn clean install.
If you face any error, follow this blog – https://help.mulesoft.com/s/article/Maven-error-when-building-application-Blocked-Mirror-for-repositories
Chat Demo in Postman
- Create a POST request with the following URL – https://api.openai.com/v1/chat/completions
- In Authorization → Bearer Token → paste the secret key.

- Create the body of the chat as follows →
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "Who was C. V. Raman?"
    }
  ]
}
Pass your query in the content section of messages.
Note → Roles are of three types:
- system – sets the behavior of the assistant.
- user – carries the normal questions from the end user.
- assistant – holds earlier model responses, so follow-up questions can build on them.
- Hit the request and you will get a successful response.
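Outside Postman, the same request can be assembled programmatically. A minimal Python sketch of the body and headers (the key value is a placeholder, and the system message is an illustrative addition showing the system role; nothing is sent here):

```python
import json

# Placeholder - substitute your real secret key.
OPENAI_API_KEY = "sk-..."

# Same body as the Postman example: an optional system message to set
# behaviour, followed by the user's question.
body = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Who was C. V. Raman?"},
    ],
}

headers = {
    "Authorization": f"Bearer {OPENAI_API_KEY}",
    "Content-Type": "application/json",
}

# This serialized string is what Postman sends as the raw body.
payload = json.dumps(body)
```

For a follow-up question, you would append the previous reply as an assistant message and the new question as another user message before resending.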

Steps For Integration In Mule Using API
- Create a demo project in Anypoint Studio. Add an HTTP Listener to trigger the flow.
- Add a Transform Message and set the payload as follows:

Pass the role from the query params and the question in the payload.
Also, pass the token from the headers.
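In rough Python terms, that Transform Message does something like the sketch below (the field names role, question, and token are assumptions mirroring the description above, not the flow's exact identifiers):

```python
# Sketch of the Transform Message logic: role from a query param,
# question from the request body, token from the inbound headers.
def build_chat_body(query_params, inbound_headers, payload):
    body = {
        "model": "gpt-3.5-turbo",
        "messages": [{
            "role": query_params.get("role", "user"),
            "content": payload["question"],
        }],
    }
    token = inbound_headers["token"]  # forwarded to the HTTP Requester
    return body, token

# Example invocation with illustrative inbound values.
body, token = build_chat_body(
    {"role": "user"},
    {"token": "sk-..."},
    {"question": "Who was C. V. Raman?"},
)
```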

- Add an HTTP Requester. In the configuration:
Protocol – HTTPS
Host – api.openai.com
Port – 443
And in the Path section, pass – /v1/chat/completions

In the Headers section, pass the variable in which you stored the token.
- Run the application. Hit the request and you will get a successful response.
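For reference, the call this flow makes is equivalent to the request below, built (but deliberately not sent) with Python's urllib; the token value is a placeholder:

```python
import json
import urllib.request

token = "sk-..."  # placeholder secret key

body = json.dumps({
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Who was C. V. Raman?"}],
}).encode()

# Mirrors the Requester config: HTTPS, host api.openai.com, port 443,
# path /v1/chat/completions.
req = urllib.request.Request(
    "https://api.openai.com:443/v1/chat/completions",
    data=body,
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; omitted here.
```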

Steps For Integration In Mule Using Connector
- Create a demo project in Anypoint Studio. Add an HTTP Listener to trigger the flow.
- Add the following dependency to the pom.xml to import the OpenAI connector.
<dependency>
  <groupId>com.mulesoft.platform_se</groupId>
  <artifactId>openai</artifactId>
  <version>1.0.12</version>
  <classifier>mule-plugin</classifier>
</dependency>
- In the Mule Palette, you will see the OpenAI connector like:

- Go to Global Elements → OpenAI Connector Config → paste your secret key under API Key and do a test connection.


- Add a Transform Message and set the payload like:
%dw 2.0
output application/java
---
[{
  "role": attributes.queryParams.role,
  "content": payload.question
}]
Pass the role from the query params and the question in the payload body.
- Drag and drop the Create chat completions operation. In Model, select “gpt-3.5-turbo”, and in Messages, pass the payload.

- Add a Transform Message to convert the Java response into a JSON response.
The complete flow looks like this:

- Save and run the application. Hit the request and you will get a response like:
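The generated text sits under choices[0].message.content in the response JSON. A short Python sketch with an illustrative sample (the field values are made up, but the shape follows the chat completions response):

```python
import json

# Illustrative response; real responses carry actual IDs, token counts,
# and generated text.
sample = json.loads("""
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "model": "gpt-3.5-turbo",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "C. V. Raman was an Indian physicist..."},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 20, "total_tokens": 32}
}
""")

# Extract the assistant's answer from the first choice.
answer = sample["choices"][0]["message"]["content"]
```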

References
- Mysore MuleSoft meetup – https://www.youtube.com/watch?v=5G3ypxhlR_U&ab_channel=MuleSoftMysoreMeetUpGroup
- OpenAI API references – https://platform.openai.com/docs/api-reference/edits
- GitHub repo for the connector; the README file contains documentation – https://github.com/djuang1/openai-connector