II. GPT
So, what is GPT?
GPT is a neural network machine learning model trained on internet data to generate text of any kind. In simpler words, it takes a small amount of input text and generates a large volume of relevant, sophisticated machine-generated text. GPT-3 offers a variety of functions, including text completion, code completion, and chat assistance.
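To make the "small input, large output" idea concrete, here is a minimal sketch of how a text-completion request to GPT-3 might be assembled in Python. This assumes OpenAI's legacy Completions API and the `openai` package; the model name, prompt, and parameter values are illustrative, not taken from this project.

```python
import os

def build_completion_request(prompt, model="text-davinci-003", max_tokens=64):
    """Assemble parameters for a GPT-3 completion call: a short input
    prompt is all the model needs to generate a longer continuation."""
    return {"model": model, "prompt": prompt, "max_tokens": max_tokens}

params = build_completion_request("Once upon a time,")

# The actual network call (requires an API key) would look like:
#   import openai
#   openai.api_key = os.environ["OPENAI_API_KEY"]
#   response = openai.Completion.create(**params)
#   print(response.choices[0].text)
print(params)
```

The same request with a chat- or code-oriented prompt exercises the other functions mentioned above; only the prompt (and possibly the model) changes.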
GPT-3's ability to simulate the intonation, content, and logic of human speech comes from training on a large amount of data. Yet it is not difficult to see from a conversation with it that its dialogue still reads as machine-like and designed to be rational. The official GPT-3 does not pick up on the details of a user's emotional input and can only respond with plain, general answers, which inspires me to ask whether the results would be more human-like if we trained GPT-3 on more personalized and emotional text data.