- Python 3.6 or newer. If your operating system doesn't provide a Python interpreter, you can go to python.org to download an installer.
- A smartphone that can send and receive text messages.
- A Twilio account. If you're new to Twilio, click here to create a free account now and get a $10 credit when you upgrade to a paid account. You can check the features and limitations of a free Twilio account.
- An OpenAI API key. Request beta access here.
- The OpenAI Python client library, to send requests to the OpenAI GPT-3 engine.
- The Twilio Python Helper Library, to work with SMS messages.
- The Flask framework, to create the web application.
- The python-dotenv package, to read a configuration file.
- The pyngrok package, to make our web application temporarily available on the internet.
- prompt: the input text.
- engine: OpenAI has made four text-completion engines available, named davinci, ada, babbage and curie. We are using davinci, which is the most capable of the four.
- stop: as I mentioned earlier, the GPT-3 engine doesn't really understand text, so when it completes text it needs to know when to stop. With a stop of Human: we are telling the engine to generate text only for the line that begins with AI:. Without a stop marker, GPT-3 would continue generating text, writing more lines for both the user and the AI.
- temperature: a number between 0 and 1 that determines how many creative risks the engine takes when generating text.
- top_p: an alternative way to control the originality and creativity of the generated text.
- frequency_penalty: a number between 0 and 1. The higher this value, the bigger the effort the model will make in not repeating itself.
- presence_penalty: a number between 0 and 1. The higher this value, the bigger the effort the model will make in talking about new topics.
- max_tokens: maximum completion length.
By Miguel Grinberg, 03.08.2020

Reviewers:
- Matt Police
- Diane Phan
A GPT-3 chatbot is a software application that can carry on a conversation with a human user through written or spoken language. The level of "intelligence" in chatbots varies greatly. While some chatbots have a fairly basic understanding of language, others use sophisticated artificial intelligence (AI) and machine learning (ML) algorithms to achieve near-human-level conversation.
In this tutorial, I'll show you how easy it is to create a chatbot for Twilio SMS using the OpenAI platform and Flask framework for Python.
Want to get an idea of the capabilities of GPT-3? Take a look at the following interaction I had with this bot using the project we're going to build in this tutorial:
Introduction to GPT-3
GPT-3 (Generative Pre-trained Transformer 3) is a sophisticated language model trained on a very large corpus of text. Despite its internal complexity, it's surprisingly easy to use: you give it some text, and the model generates something else, following a similar style and structure.
What do you think of the first paragraph of this article? Would you believe me if I told you that I didn't write it myself and that it was generated entirely by GPT-3? I gave the model the title and introduction from two other blog posts I've written in the past, and then added the title of this article. Here is the full entry I provided:
```
Title: Build a WhatsApp Chatbot Using Python, Flask, and Twilio
Summary: A chatbot is a software application that can carry on a conversation with a human user through written or spoken language. The level of "intelligence" in chatbots varies greatly. While some chatbots have a fairly basic understanding of language, others use sophisticated artificial intelligence (AI) and machine learning (ML) algorithms to achieve near-human-level conversation. In this tutorial, I'll show you how easy it is to create a chatbot for WhatsApp using the Twilio API for WhatsApp and the Flask framework for Python.

Title: Run Flask's regularly scheduled tasks with Cron
Summary: A common requirement for web applications is that a task runs periodically in the background. This could be a task that imports new data from third-party sources, or perhaps one that removes revoked tokens from your database once they expire. In this and many other situations, you face the challenge of implementing a task that runs in the background at regular intervals. That's a pattern that a lot of people ask me about. I've seen implementations based on the APScheduler package, on Celery, and even on proprietary solutions built in a background thread. Unfortunately, none of these options are very good. In this article, I'll show you what I think is a very robust implementation based on the Flask CLI and the Cron service.

Title: Building an SMS Chatbot Using OpenAI's GPT-3 Engine, Twilio, and Python
Summary:
```
Note that the first two blocks of text have "Title" and "Summary" lines, but the third entry has an empty summary. When the GPT-3 engine looks at this input, it understands that it needs to complete the "Summary" line with text that is appropriate for the given title. In my opinion, it did pretty well!
GPT-3 is non-deterministic in the sense that different executions of the engine return different responses given the same input. In fact, with the entry above, I generated some introductory paragraphs and chose the one I liked the most to start the article. Here are a few more that deserve to be featured:
Summary: I recently read about GPT-3 and was surprised by how advanced this chatbot is. After reading the article, I thought: I want to do this, but I don't have access to a GPU. Luckily there are some projects that allow you to use a neural network in a non-graphical environment and one of them is OpenAI. In this article I will show you how to create a GPT-3 based chatbot using OpenAI, Twilio and Python.
Summary: In this article, I will show you how to create a simple chatbot using the GPT-3 algorithm and the Twilio SMS service. The algorithm reads and responds to text messages, and we use OpenAI's GPT-3 Python wrapper to do this.
Hard to believe, right? As amazing as this technology is, you should be careful, because the engine doesn't fully understand the text it generates and may provide misleading responses. Consider these two other introductory paragraphs that it also generated from my input:
Summary: GPT-3 is a language modeling algorithm developed by Google. It is used to predict the next word in a sequence of words. This is a common problem in natural language processing and text generation. In this article, I'll show you how to create a chatbot that uses GPT-3 to generate replies to your messages. The chatbot will be able to answer questions about itself, the weather, and other topics.
Summary: In this post, I will teach you how to create a super cool GPT-3 based chatbot using the OpenAI platform. GPT-3 (Generative Preachers Talk 3) is a neural network that can generate new sermons for the original Preacher's Talks 3 dataset. With just this data and a few lines of Python code, we'll build a chatbot that can have sensible conversations about anything that can be translated into a text message.
While the former is really good structurally, it has one major factual flaw: Google didn't develop this technology. The second example not only misrepresents the GPT-3 acronym, but is equal parts ridiculous and hilarious!
Implementation of a chatbot with GPT-3
How can we use this great technology? Build a chatbot, of course!
We can "prime" the engine with a sample interaction or two between the user and the AI to set the tone for the bot. We then add the actual user prompt at the end and let the engine generate the response.
Here's an example of an initial query that would work for a chatbot:
```
Human: Hello, who are you?
AI: I'm fine. How can I help you today?
Human: Who played Forrest Gump in the movie?
AI:
```
The exchange in the first two lines is made up. It is there to tell GPT-3 that we want alternating lines that begin with "Human:" and "AI:", and that the AI will be answering questions. The third line is the first actual question from the user.
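The assembly of such a prompt is plain string formatting, as the short sketch below shows (the variable names here are illustrative, not from the project's code):

```python
# Sketch: combining the made-up priming exchange with the user's real
# question. The trailing "AI:" line tells the engine whose turn it is.
chat_log = '''Human: Hello, who are you?
AI: I'm fine. How can I help you today?
'''
question = 'Who played Forrest Gump in the movie?'
prompt = f'{chat_log}Human: {question}\nAI:'
print(prompt)
```

The engine then continues the text right after the final `AI:` marker.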
When I fed this input to the engine, I got the following text back:

```
AI: Tom Hanks.
```
So far, so good. Now let's say the user wants to ask another question. Here we have the opportunity to incorporate this last interaction into the input, as this helps GPT-3 have better context for the conversation. This is how the input for a follow-up question could look:
```
Human: Hello, who are you?
AI: I'm fine. How can I help you today?
Human: Who played Forrest Gump in the movie?
AI: Tom Hanks.
Human: Did he have other important roles?
AI:
```
You can probably see why expanding the prompt with the previous question and answer was a good idea: the user now refers to "he", assuming the chatbot remembers that the previous question was about Tom Hanks.
Fortunately, GPT-3 is up to the challenge. Here is the answer:
AI: Yes, he was the star of Big, Saving Private Ryan and You've Got Mail.
Ready to build a chatbot based on this simple algorithm? Let's do this.
Tutorial requirements
To follow this tutorial you will need the following items:
Create a Python virtual environment
Following Python best practices, as a first step in creating the chatbot we are going to create a separate directory for the project, and inside it a virtual environment. Then we will install the Python packages needed for our chatbot in it.
If you are using a Unix or Mac OS system, open a terminal and enter the following commands to perform the tasks described above:
```
$ mkdir twilio-openai-bot
$ cd twilio-openai-bot
$ python3 -m venv venv
$ source venv/bin/activate
(venv) $ pip install openai twilio flask python-dotenv pyngrok
```
For those of you following the tutorial on Windows, type the following commands into a command prompt window:
```
$ md twilio-openai-bot
$ cd twilio-openai-bot
$ python -m venv venv
$ venv\Scripts\activate
(venv) $ pip install openai twilio flask python-dotenv pyngrok
```
The last command uses pip, the Python package installer, to install the five packages that we will be using in this project.
Configuration
As mentioned above, this project requires an OpenAI API key. As of this writing, the only way to obtain one is to be accepted into their private beta program. You can apply on their site.
The Python application needs to have access to this key, so we are going to create a .env file and put it in there. The application will then import it from this file as an environment variable.
Create a .env file in your project directory (note the leading dot) and enter a single line of text in it:

```
OPENAI_KEY=your-openai-api-key-here
```
How to work with this file is explained in the next section. If you plan to put your project under source control, make sure this file is excluded, as you don't want to accidentally share your OpenAI key.
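If you use git for source control, one common way to keep this file out of the repository is to list it in a .gitignore file at the project root, together with the virtual environment directory (a suggestion, not part of the original project):

```
.env
venv/
```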
Sending GPT-3 requests from Python
In this section we will create the support code that will allow us to work with the OpenAI GPT-3 engine. The code will be stored in a file called chatbot.py. Below is the initialization section of this file:
```python
import os
from dotenv import load_dotenv
import openai

load_dotenv()
openai.api_key = os.environ.get('OPENAI_KEY')
completion = openai.Completion()

start_chat_log = '''Human: Hello, who are you?
AI: I'm fine. How can I help you today?
'''
```
The load_dotenv() function imports any variables stored in a .env file as environment variables. Note how we use the OPENAI_KEY variable on the next line to initialize OpenAI with the key. The completion variable holds the actual client to the engine. This is the object we will use to send requests.

I also added a start_chat_log variable, containing the two lines that prime the engine. Once the bot is up and running, I encourage you to try different interactions in this variable to see how the bot changes its responses accordingly.
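As a hypothetical example of such an experiment, a priming text like the one below gives the bot a completely different persona; assigning it to start_chat_log would change the tone of every response (the pirate wording is my own invention, not from the original project):

```python
# Hypothetical alternative priming text: the same two-line structure,
# but the AI's sample reply sets a very different tone for the bot.
pirate_chat_log = '''Human: Hello, who are you?
AI: Arrr, I be a chatbot of the high seas! What be yer question, matey?
'''
```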
Now let's write a function that performs a GPT-3 query. Add the following function at the end of chatbot.py:
```python
def ask(question, chat_log=None):
    if chat_log is None:
        chat_log = start_chat_log
    prompt = f'{chat_log}Human: {question}\nAI:'
    response = completion.create(
        prompt=prompt, engine="davinci", stop=['\nHuman'], temperature=0.9,
        top_p=1, frequency_penalty=0, presence_penalty=0.6, best_of=1,
        max_tokens=150)
    answer = response.choices[0].text.strip()
    return answer
```
The ask() function takes the user's question as a first argument, followed by an optional chat log. If the chat log isn't provided, the function uses start_chat_log instead.
The prompt variable is built to contain the chat log, followed by the user's question prefixed with Human:. After the user's prompt we add one final line reading just AI:, which gives the GPT-3 engine the cue to generate a response to the user's question.
The completion.create() function is where the request to the GPT-3 engine is actually made. This function takes a number of arguments, which are used to configure how the engine should complete the text; these are the arguments described at the top of this article.
These aren't the only possible options, so I recommend you check the OpenAI reference documentation to learn about more ways to configure your request.
The response from the completion engine is an object with a choices attribute, which is a list of completions. We didn't request multiple completions, so the list contains a single element. This element is a Python dictionary with a text key, containing the generated text. Our function takes this text, removes any leading or trailing whitespace, and returns it to the caller. See the API documentation mentioned above for information about the other data elements included in a GPT-3 response.
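To make that parsing step concrete, here is a small sketch using a stand-in dictionary shaped like the response described above (the real response object also supports attribute access, which is what chatbot.py uses):

```python
# Sketch: extracting the generated text from a completion response.
# fake_response is a stand-in with the same shape as the engine's reply.
fake_response = {
    'choices': [
        {'text': ' Tom Hanks played Forrest Gump in the movie.  '},
    ],
}
# Take the single completion, then strip leading/trailing whitespace.
answer = fake_response['choices'][0]['text'].strip()
print(answer)  # Tom Hanks played Forrest Gump in the movie.
```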
Let's start a Python shell and play with the ask() function:
```
>>> from chatbot import ask
>>> ask('Who played Forrest Gump in the movie?')
"Wow, that's a tough question! Forrest Gump was played by Tom Hanks."
>>> ask('How long does it take to travel from Los Angeles to Dublin?')
'It takes approximately 12 hours to fly from Los Angeles to Dublin. You may wish to fly via London Heathrow Airport.'
```
Pretty cool right? What we're missing is the second part of our algorithm, where we add a question and its answer to the chat log so we can use it in the next question. We can implement a second function to update the chat history:
```python
def append_interaction_to_chat_log(question, answer, chat_log=None):
    if chat_log is None:
        chat_log = start_chat_log
    return f'{chat_log}Human: {question}\nAI: {answer}\n'
```
This function takes a question and an answer and appends them at the end of the chat log. The updated chat log is returned.
Now we can have a conversation that preserves context. Try the following in a new Python shell session:
```
>>> from chatbot import ask, append_interaction_to_chat_log
>>> chat_log = None
>>> question = 'Who played Forrest Gump in the movie?'
>>> answer = ask(question, chat_log)
>>> answer
'Forrest Gump is a 1994 American romantic comedy based on the 1986 novel of the same name by Winston Groom. The film was directed by Robert Zemeckis and adapted for the screen by Eric Roth. It stars Tom Hanks as Forrest Gump, for which he won an Academy Award for Best Actor and was nominated for Best Picture.'
>>> chat_log = append_interaction_to_chat_log(question, answer, chat_log)
>>> question = 'Did he have other great roles?'
>>> answer = ask(question, chat_log)
>>> answer
'He starred in The Green Mile (1999), a drama film based on the Stephen King novel of the same name.'
```
These two functions are all we need to manage our chat. In the next few sections, we'll integrate them with Twilio's SMS messaging.
Configure Twilio Programmable Messaging
Next, let's set up an SMS-enabled Twilio phone number for the chatbot. If you haven't created a free Twilio account yet, create it now and log in.
From the Twilio Console, select Phone Numbers, and then click the red plus sign to buy a Twilio number. In the Buy a Number screen you can select your country and check SMS in the Capabilities field. If you'd like to request a number from your region, you can enter your area code in the Number field.
Click the Search button to see what numbers are available, and then click Buy for the number that you like from the results. After you confirm your purchase, click the Close button. Keep in mind that if you are using a trial account, the funds for this purchase come from your trial credit.
Congratulations, you now have your own programmable phone number! Grab your smartphone and text your new number to make sure it works. You should get a response asking you to set up your number, which is what you'll do next.
The Twilio SMS API uses a webhook to notify an application when a message arrives. The automated response you received when you texted your Twilio number actually comes from a default webhook that Twilio installs on your number. The chatbot application will have a new webhook that replaces the default one.
Defining a webhook is extremely easy with the Flask framework. Below is a skeleton application with a webhook definition. Copy the code below into a new file named app.py:
```python
from flask import Flask, request
from twilio.twiml.messaging_response import MessagingResponse

app = Flask(__name__)

@app.route('/bot', methods=['POST'])
def bot():
    incoming_msg = request.values['Body']
    # use the incoming message to generate the response here
    r = MessagingResponse()
    r.message('this is the response')
    return str(r)
```
This code creates a Flask application with an endpoint at the URL /bot. When Twilio is configured to know about this endpoint, it will send a request each time an SMS arrives at the phone number we acquired earlier. The application can read the incoming message, do some processing, and generate a response, which is returned at the end of the function.
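For the curious, the TwiML document that MessagingResponse renders for a single message has roughly the shape sketched below, reconstructed here with the standard library rather than the twilio helper (the real output also carries an XML declaration):

```python
import xml.etree.ElementTree as ET

# Sketch: the TwiML payload that Twilio expects back from the webhook,
# built by hand to show its structure.
root = ET.Element('Response')
msg = ET.SubElement(root, 'Message')
msg.text = 'this is the response'
twiml = ET.tostring(root, encoding='unicode')
print(twiml)  # <Response><Message>this is the response</Message></Response>
```

Twilio reads this document and sends the text inside the Message element back to the user as an SMS.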
You can launch the above application as follows:
```
(venv) $ flask run
 * Serving Flask app "app" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
```
The application is now running on your computer at the http://localhost:5000 URL, but it is not accessible from the Internet. The next step is to provide a temporary public URL that we can give to Twilio, so that it can send requests to it. This work is done by the ngrok tool. Leave the application running and open a second terminal window. Activate the virtual environment in this terminal as you did in the first one at the start of the tutorial, and then enter the following command:
```
(venv) $ ngrok http 5000
```
You should see a screen similar to the following:
Note the lines beginning with "Forwarding". These show the public URLs that ngrok uses to redirect requests into our service, in both http:// and https:// format. What we need to do now is tell Twilio to send incoming SMS notifications to this URL.
Go back to the Twilio Console, click on Phone Numbers, and then on the phone number you purchased above. Scroll down to the Messaging section, copy the https:// URL from the ngrok output, and paste it into the "A message comes in" field. Since the endpoint we created above is exposed under the /bot URL, /bot needs to be appended at the end of the ngrok root URL. Make sure the request method is set to HTTP POST. Don't forget to click the blue Save button at the bottom of the page to record these changes.
Send another SMS to your Twilio phone number and you will now receive the response from the Flask app:
We now know how to run queries against the GPT-3 engine, and how to receive and reply to SMS messages with Twilio. In the next section we will integrate the two parts and have a complete chatbot project.
It is highly recommended that you create a free ngrok account and install your ngrok account's authtoken on your computer to avoid hitting limitations in this service. See this blog post for details.
SMS chatbot using OpenAI and Twilio
We'll now expand app.py to send user messages to the ask() function we created earlier, and to keep track of the chat log. Here are the updated contents of app.py:
```python
from flask import Flask, request, session
from twilio.twiml.messaging_response import MessagingResponse
from chatbot import ask, append_interaction_to_chat_log

app = Flask(__name__)
app.config['SECRET_KEY'] = 'top-secret!'

@app.route('/bot', methods=['POST'])
def bot():
    incoming_msg = request.values['Body']
    chat_log = session.get('chat_log')
    answer = ask(incoming_msg, chat_log)
    session['chat_log'] = append_interaction_to_chat_log(incoming_msg, answer,
                                                         chat_log)
    r = MessagingResponse()
    r.message(answer)
    return str(r)
```
The most convenient place to store the chat log with a given user is the session variable from Flask, which uses an HTTP cookie for storage. Twilio works similarly to a web browser in this regard; it accepts, stores (for up to four hours) and resends cookies during the course of a conversation with a user. When two or more users chat with the bot at the same time, each will have their own cookie.
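Conceptually, this gives each user an isolated chat log, as if keyed by their phone number. The sketch below simulates that behavior with a plain dictionary (the real application relies on Flask's session and Twilio's cookie handling instead of an explicit dictionary, and the phone numbers here are made up):

```python
# Sketch: per-user chat logs, simulated with a dictionary keyed by
# phone number. Flask's session object plays this role in the real app.
sessions = {}

def get_chat_log(phone_number):
    # Returns None for a first-time user, like session.get('chat_log')
    return sessions.get(phone_number)

def save_chat_log(phone_number, chat_log):
    sessions[phone_number] = chat_log

save_chat_log('+15550001111', 'Human: Hi\nAI: Hello!\n')
print(get_chat_log('+15550001111'))  # this user's own log
print(get_chat_log('+15550002222'))  # None: a different user starts fresh
```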
Because session cookies are signed, the Flask app must have a secret key configured. For security reasons, you should replace the placeholder value I used for the secret key with something unique.
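One simple way to generate such a unique value is Python's standard secrets module (a suggestion, not part of the original project):

```python
import secrets

# Generate 16 random bytes rendered as 32 hexadecimal characters,
# suitable as a Flask SECRET_KEY value.
secret_key = secrets.token_hex(16)
print(secret_key)
```

You can run this once in a Python shell and paste the result into app.config['SECRET_KEY'].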
The bot() endpoint starts by retrieving the user's message and the chat log from the session, as before. The session object uses the familiar dictionary syntax, so session.get() returns a previously stored chat log if one exists, or None if this is the first interaction with this user. This is convenient, because the functions in chatbot.py use the default chat log when we pass None.
Calling the ask() function works exactly as before: we pass the user's message and the chat log, and we get the GPT-3 engine's completion, which is our answer.
The question and the answer are then appended to the chat log, and the updated chat log is saved back to the user's session, so that the complete chat history is used in the next interaction with this user.
To end the function, the answer is incorporated into the response, which is sent back to the user via SMS.
Ready to try your new bot? If your Flask application is still running, stop it with Ctrl-C and then start it again so that it picks up the latest updates. If you are no longer running ngrok, go to your second terminal and restart it, then follow the instructions in the previous section to update the webhook URL in the Twilio Console, since ngrok creates a different URL each time it runs.
With the Flask app and ngrok running, you can start sending SMS to the bot. As before, send your first question to the Twilio phone number and an answer should arrive within seconds. Feel free to try the bot on different topics, it knows a lot!
In the example interaction below, it looks like the bot is losing patience with my stupid questions, but you can see how this conversation feels very natural if you keep the chat log in the user's session:
Conclusion
I hope you had as much fun with this project as I did! The possibilities of the GPT-3 completion engine are endless, so I encourage you to experiment with it, adapting this project to create other original applications besides chat.
If you build something cool, I'd love to hear about it!
Miguel Grinberg is a Python developer for technical content at Twilio. Reach out to him at mgrinberg [at] twilio [dot] com if you have a cool Python project you'd like to share on this blog!