What was ChatGPT trained on?

ChatGPT owes its success to its ability to automatically generate human-like text, a goal also pursued by earlier systems such as Microsoft's Tay and Meta's Galactica.

GPT-3

The GPT series is a family of language models built on the Transformer architecture.
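
As an illustration of how such an autoregressive Transformer produces text, here is a minimal sketch using the publicly available GPT-2 checkpoint from the Hugging Face transformers library as a stand-in (ChatGPT's own weights are not public); the prompt and sampling parameters are arbitrary.

```python
# Minimal sketch: autoregressive text generation with a GPT-style
# Transformer, using the public GPT-2 checkpoint as a stand-in
# (ChatGPT's own weights are not available).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "What was ChatGPT trained on?"
inputs = tokenizer(prompt, return_tensors="pt")

# The model predicts one token at a time, conditioned on everything
# generated so far, until max_new_tokens is reached.
output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```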

Learning methods

Step 1: Collect demonstration data and train a policy with supervised learning. This first step corresponds to supervised fine-tuning of the GPT-3.5 model.
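
The sketch below shows, in simplified form, what this supervised fine-tuning looks like: the model is trained to reproduce human-written demonstrations with a standard next-token cross-entropy loss. The model name, example data, and hyperparameters are illustrative placeholders, not ChatGPT's actual setup.

```python
# Simplified sketch of supervised fine-tuning: the model learns to
# reproduce human-written demonstrations via cross-entropy loss.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

# Placeholder demonstration data: (prompt, human-written answer) pairs.
demonstrations = [
    ("Explain photosynthesis.", "Photosynthesis is the process by which plants convert light into chemical energy."),
]

model.train()
for prompt, answer in demonstrations:
    # Concatenate prompt and demonstration into one training sequence.
    batch = tokenizer(prompt + " " + answer, return_tensors="pt")
    # With labels equal to input_ids, the model computes the standard
    # next-token cross-entropy loss internally.
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```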

Step 3: Optimize the policy against a reward model, trained in an intermediate step on human rankings of the model's outputs, using a Reinforcement Learning (RL) algorithm, here Proximal Policy Optimization (PPO).
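
The sketch below conveys the core idea of this step: the policy (the fine-tuned model) generates an answer, a reward model scores it, and that score drives the update. It is a bare REINFORCE-style illustration with a placeholder reward function, not full PPO, which additionally uses clipped probability ratios, a value function, and a KL penalty toward the supervised model.

```python
# Heavily simplified sketch of reward-driven policy optimization.
# The reward_model function is a placeholder for a learned model
# trained on human rankings of outputs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
policy = AutoModelForCausalLM.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(policy.parameters(), lr=1e-6)

def reward_model(prompt: str, answer: str) -> float:
    """Placeholder reward: in reality a learned model of human preferences."""
    return 1.0 if len(answer.split()) > 5 else -1.0

prompt = "Explain what a transformer is."
inputs = tokenizer(prompt, return_tensors="pt")
response_ids = policy.generate(**inputs, max_new_tokens=30, do_sample=True)
answer = tokenizer.decode(response_ids[0][inputs["input_ids"].shape[1]:],
                          skip_special_tokens=True)

# Mean log-probability of the sampled sequence under the current policy.
out = policy(response_ids, labels=response_ids)
log_prob = -out.loss

# Reward-weighted update: answers the reward model likes become more
# probable, disliked ones less probable.
loss = -reward_model(prompt, answer) * log_prob
loss.backward()
optimizer.step()
optimizer.zero_grad()
```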

Conversational agent

The chatbot itself is the GPT-3.5 model optimized with PPO; in addition, a moderation API is used to filter the answers the chatbot provides.
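
As an illustration, a chatbot front end might screen a message with OpenAI's moderation endpoint before passing it to the language model, roughly as sketched below. The endpoint path and response fields follow OpenAI's public API documentation and may change; the environment variable and example messages are assumptions.

```python
# Sketch: screen a user message with OpenAI's moderation endpoint
# before forwarding it to the language model. Requires OPENAI_API_KEY
# to be set in the environment.
import os
import requests

def is_flagged(text: str) -> bool:
    resp = requests.post(
        "https://api.openai.com/v1/moderations",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={"input": text},
        timeout=10,
    )
    resp.raise_for_status()
    # The API returns one result per input, with a boolean "flagged" field.
    return resp.json()["results"][0]["flagged"]

user_message = "Tell me about the history of cryptography."
if is_flagged(user_message):
    print("Sorry, I cannot help with that request.")
else:
    print("Message accepted, forwarding to the model...")
```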

Limitations

Like any text-generation system, ChatGPT can produce plausible-sounding but nonsensical answers, since its output only reflects what the language model has retained from its training data.

Conclusion

ChatGPT is a machine, an algorithm, yet it is easy to get caught up in the game of asking it question after question and to start seeing the machine as all-knowing because of the breadth of its apparent knowledge.