ChatGPT Integration with MuleSoft Using API and Connector

Authors: Ashish Singh Chauhan and Shubham Singh Panwar

What is ChatGPT?

ChatGPT is an artificial intelligence language model developed by OpenAI. It is built on a deep learning architecture called the transformer and generates human-like responses to text-based inputs. ChatGPT is capable of understanding and generating natural language, allowing it to converse with users on a wide range of topics. It has been trained on a vast amount of text from the internet, books, and other sources, which allows it to generate responses that are relevant and informative.

Prerequisites for Integration in Mule
  1. An active OpenAI account is required. If you do not have one, you can create an account here – https://chat.openai.com/chat
  2. You must have an API key, which is used for the integration.

To get an API key:

Go to https://platform.openai.com/overview → Personal (top right corner) → View API keys → Create secret key → Copy the key.

  3. Download the OpenAI connector from GitHub – https://github.com/djuang1/openai-connector
  4. Extract the zip → open a command prompt in that location → run a Maven clean install (see the commands below).
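Assuming the extracted folder is named openai-connector-main (the actual folder name may differ depending on the release you downloaded), the build looks like this:

cd openai-connector-main
mvn clean install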

If you face any errors during the build, follow this article – https://help.mulesoft.com/s/article/Maven-error-when-building-application-Blocked-Mirror-for-repositories

Chat Demo in Postman
  1. Create a POST request with the following URL – https://api.openai.com/v1/chat/completions
  2. In Authorization → Bearer Token → paste the secret key.
  3. Create the body of the request as follows:
{
    "model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": "user",
            "content": "Who was CV Raman?"
        }
    ]
}

Pass your query in the content field of the messages array.

Note → Messages can carry different roles:

  1. system – sets the behavior of the assistant.
  2. user – carries the questions you ask.
  3. assistant – carries previous model responses, so the model can answer follow-up questions in context.
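A sketch of how these roles can be combined in a single request body (the content strings are only illustrations):

{
    "model": "gpt-3.5-turbo",
    "messages": [
        { "role": "system", "content": "You are a helpful assistant that answers briefly." },
        { "role": "user", "content": "Who was CV Raman?" },
        { "role": "assistant", "content": "CV Raman was an Indian physicist best known for the Raman effect." },
        { "role": "user", "content": "When did he win the Nobel Prize?" }
    ]
}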
  4. Send the request and you will get a successful response.
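A typical chat completions response has roughly the following shape (the id, timestamp, token counts, and generated text will differ on every call):

{
    "id": "chatcmpl-abc123",
    "object": "chat.completion",
    "created": 1700000000,
    "model": "gpt-3.5-turbo",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "CV Raman was an Indian physicist best known for discovering the Raman effect."
            },
            "finish_reason": "stop"
        }
    ],
    "usage": {
        "prompt_tokens": 13,
        "completion_tokens": 17,
        "total_tokens": 30
    }
}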
Steps For Integration In Mule Using API
  1. Create a demo project in Anypoint Studio. Add an HTTP Listener to trigger the flow.
  2. Add a Transform Message component and set the payload as described below.

Pass the role from the query params and the question from the incoming payload.

Also, take the OpenAI key from the inbound headers and store the Authorization header in a variable for the requester to use.
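A minimal DataWeave sketch for this step, assuming the caller sends a JSON body with a question field, the role in a role query param, and the OpenAI key in a token header (all of these names are assumptions for this walkthrough). The payload becomes the chat completions request body:

%dw 2.0
output application/json
---
{
    "model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": attributes.queryParams.role default "user",
            "content": payload.question
        }
    ]
}

In the same Transform Message (or a separate Set Variable), build a requestHeaders variable that the HTTP requester can forward:

%dw 2.0
output application/java
---
{
    "Authorization": "Bearer " ++ attributes.headers.token,
    "Content-Type": "application/json"
}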

  3. Add an HTTP Request component. In its connector configuration set:

Protocol – HTTPS

Host – api.openai.com

Port – 443

In the Path field of the operation, pass /v1/chat/completions.

In the Headers section, pass the variable in which you stored the Authorization header (vars.requestHeaders in the sketch above).
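A rough XML sketch of what this produces (element names follow the Mule 4 HTTP connector; the config name and vars.requestHeaders are assumptions carried over from the transform above):

<http:request-config name="OpenAI_HTTP_Request_configuration">
    <http:request-connection protocol="HTTPS" host="api.openai.com" port="443" />
</http:request-config>

<http:request method="POST" config-ref="OpenAI_HTTP_Request_configuration"
              path="/v1/chat/completions" doc:name="Call chat completions">
    <http:headers><![CDATA[#[vars.requestHeaders]]]></http:headers>
</http:request>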

  4. Run the application. Send a request to the listener and you will get a successful response.
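Assuming the HTTP Listener is on the default port 8081 with a path of /chat (both are assumptions for this demo), a test call could look like this:

curl -X POST "http://localhost:8081/chat?role=user" \
  -H "token: <your-openai-key>" \
  -H "Content-Type: application/json" \
  -d '{ "question": "Who was CV Raman?" }'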
Steps For Integration In Mule Using Connector
  1. Create a demo project in Anypoint Studio. Add an HTTP Listener to trigger the flow.
  2. Add the following dependency to the pom.xml to import the OpenAI connector.
<dependency>
    <groupId>com.mulesoft.platform_se</groupId>
    <artifactId>openai</artifactId>
    <version>1.0.12</version>
    <classifier>mule-plugin</classifier>
</dependency>
  3. In the Mule Palette, you will now see the OpenAI connector and its operations.
  4. Go to Global Elements → OpenAI Connector Config → under API Key paste your secret key and run a test connection.
  5. Add a Transform Message component and set the payload as follows:
%dw 2.0
output application/java
---
[{
    "role": attributes.queryParams.role,
    "content": payload.question
}]

Pass the role from the query params and the question in the payload body.

  6. Drag and drop the Create chat completions operation. In the Model field select “gpt-3.5-turbo” and in Messages pass the payload.
  7. Add a Transform Message component to convert the Java response into a JSON response.
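A minimal sketch of this final transform (it simply re-serializes whatever the connector returned as JSON):

%dw 2.0
output application/json
---
payload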

The complete flow is: HTTP Listener → Transform Message → Create chat completions → Transform Message.

  8. Save and run the application. Send a request and you will get a response similar to the Postman demo above.
References
  1. MuleSoft Mysore Meetup – https://www.youtube.com/watch?v=5G3ypxhlR_U&ab_channel=MuleSoftMysoreMeetUpGroup
  2. OpenAI API reference – https://platform.openai.com/docs/api-reference/edits
  3. GitHub repo for the connector (the README contains the documentation) – https://github.com/djuang1/openai-connector
