ChatGPT


Facts about ChatGPT

- 51% of IT professionals predict we will witness a successful cyberattack carried out with the help of ChatGPT by the end of the year
- ChatGPT has passed the US Medical Licensing Exam, law school exams, and a Wharton MBA exam
- The daily cost of running ChatGPT is roughly $700,000

How does ChatGPT work?

- ChatGPT does not think; it predicts
- It predicts which word comes next and builds a sentence from that
- Every time you ask it a question, it works through all of its little pieces of text and chooses the one that makes the most sense (see the sketch below)
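A minimal sketch of that "predict the next word" idea, using a toy word-frequency table of my own invention; real GPT models use a neural network over tokens, but the prediction objective is the same.

```python
# Toy next-word predictor (not OpenAI's code): given a word, pick the word
# that most often came next in the training text.
from collections import Counter, defaultdict

training_text = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which in the training text.
next_word_counts = defaultdict(Counter)
for current, following in zip(training_text, training_text[1:]):
    next_word_counts[current][following] += 1

def predict_next(word: str) -> str:
    """Return the word that most frequently followed `word` in training."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(predict_next("the"))   # 'cat' -- the most common continuation
print(predict_next("cat"))   # 'sat' (ties broken by first occurrence)
```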

Layers of NNA

- Input layer (your eyes seeing a picture of an animal)
- Hidden layers (neurons detecting fur, wings, legs)
- Output layer (the neuron with the "highest activation" gives the answer, as sketched below)
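A minimal sketch of those three layers with hand-picked, illustrative weights (not a trained model): features go in, hidden "detector" neurons fire, and the output neuron with the highest activation wins.

```python
import numpy as np

# Input layer: [has_fur, has_wings, num_legs] for one animal
x = np.array([1.0, 0.0, 4.0])

# Hidden layer: three "detector" neurons with hypothetical weights
W_hidden = np.array([[ 2.0, -1.0,  0.1],   # fur detector
                     [-1.0,  2.0,  0.0],   # wing detector
                     [ 0.0,  0.0,  0.5]])  # leg detector
hidden = np.maximum(0, W_hidden @ x)       # ReLU activation

# Output layer: scores for ["cat", "bird"]
W_out = np.array([[ 1.0, -1.0,  0.5],
                  [-1.0,  1.0, -0.5]])
scores = W_out @ hidden

labels = ["cat", "bird"]
print(labels[int(np.argmax(scores))])      # the neuron with the highest activation
```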

Time it took Platform to reach 1 million users

- Netflix: 3.5 years
- Facebook: 10 months
- Spotify: 5 months
- Instagram: 2 months
- ChatGPT: 5 days
*Note: ChatGPT has the fastest-ever growth to reach 1 million users

1 gigawatt hour (GWh) is enough to power approximately...

- About 33,000 average US households for a day (worked out below)
- Running ChatGPT takes an enormous amount of power
- A ChatGPT query uses about 10 times the energy required for a Google search
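A quick sanity check of the 33,000-households figure, assuming an average US household uses roughly 30 kWh per day (that per-household number is my assumption, close to the commonly cited US average).

```python
gwh_in_kwh = 1_000_000            # 1 GWh = 1,000,000 kWh
household_kwh_per_day = 30        # assumed average daily household use

households_powered = gwh_in_kwh / household_kwh_per_day
print(round(households_powered))  # ~33,333 households for one day
```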

ChatGPT

A controversial product built on top of GPT models

LLM

A model trained on tons of text to predict language patterns

Bookcorpus

Books1 and Books2 - an attempt to get every book that has ever been digitized and published online into ChatGPT's training data

Feedback Loop

ChatGPT continuously learns and adapts. It doesn't learn in real-time during our conversation, but feedback from interactions helps improve future versions of the model. This feedback loop can include user feedback, corrections, and updates from developers.
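A minimal sketch of what collecting that feedback might look like, using a hypothetical local log file; OpenAI's actual pipeline (human raters and reinforcement learning from human feedback) is far more involved.

```python
# Hypothetical feedback logger: store each conversation turn with a
# thumbs-up/down rating so the examples can inform a future model version.
import json
from datetime import datetime, timezone

FEEDBACK_FILE = "feedback_log.jsonl"   # hypothetical local log file

def log_feedback(prompt: str, response: str, rating: int) -> None:
    """Append one rated interaction (rating: +1 good, -1 bad) to the log."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "response": response,
        "rating": rating,
    }
    with open(FEEDBACK_FILE, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

log_feedback("What does GPT stand for?", "Generative Pretrained Transformer", +1)
```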

What does GPT stand for?

Generative Pretrained Transformer

Data Collection

Initially, a large dataset is gathered. In the case of ChatGPT, this dataset includes vast amounts of text from the internet, books, articles, conversations, and more.
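A minimal sketch of that data-collection step, assuming a hypothetical folder layout rather than OpenAI's real pipeline: gather raw text files from several sources into one corpus and drop exact duplicates.

```python
from pathlib import Path

SOURCE_DIRS = ["web_pages", "books", "articles"]   # hypothetical source folders

def collect_corpus(dirs: list[str]) -> list[str]:
    """Read every .txt file under the given folders, skipping exact duplicates."""
    seen, corpus = set(), []
    for d in dirs:
        for path in Path(d).glob("**/*.txt"):
            text = path.read_text(encoding="utf-8", errors="ignore")
            if text not in seen:          # naive exact-duplicate filter
                seen.add(text)
                corpus.append(text)
    return corpus

documents = collect_corpus(SOURCE_DIRS)
print(f"collected {len(documents)} documents")
```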

How was ChatGPT created?

OpenAI trained an LLM on several large text datasets (combined as sketched below):
- Common Crawl
- WebText2
- BookCorpus
- Wikipedia
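A minimal sketch of how those four sources might be combined into one training mix. The GPT-3 paper describes weighted sampling across its sources, but the weights and documents below are placeholders, not the real proportions.

```python
import random

corpora = {
    "common_crawl": ["web page text ..."] * 60,
    "webtext2":     ["reddit-linked page text ..."] * 20,
    "bookcorpus":   ["book text ..."] * 10,
    "wikipedia":    ["encyclopedia text ..."] * 10,
}
weights = {"common_crawl": 0.6, "webtext2": 0.2, "bookcorpus": 0.1, "wikipedia": 0.1}

def sample_document() -> str:
    """Pick a source according to its weight, then a random document from it."""
    source = random.choices(list(weights), weights=list(weights.values()), k=1)[0]
    return random.choice(corpora[source])

batch = [sample_document() for _ in range(8)]   # one toy training batch
```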

Training

The collected data is used to train the model. During training, the model learns to understand patterns, relationships, and context within the text. For example, it learns that certain words often follow others, or that certain phrases convey specific meanings.
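A minimal sketch of that training objective with a toy vocabulary and one made-up sentence: the model is rewarded for assigning high probability to the word that actually comes next. PyTorch is used here only for illustration.

```python
import torch
import torch.nn as nn

vocab = ["<pad>", "the", "cat", "sat", "on", "mat"]
word_to_id = {w: i for i, w in enumerate(vocab)}

# Tiny "language model": embed the current word, predict scores for the next one.
model = nn.Sequential(
    nn.Embedding(len(vocab), 16),
    nn.Linear(16, len(vocab)),
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

sentence = ["the", "cat", "sat", "on", "the", "mat"]
inputs  = torch.tensor([word_to_id[w] for w in sentence[:-1]])
targets = torch.tensor([word_to_id[w] for w in sentence[1:]])

for step in range(200):                 # learn "which word follows which"
    logits = model(inputs)              # shape: (5, vocab_size)
    loss = loss_fn(logits, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(loss.item())                      # loss drops as patterns are learned
```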

Where did ChatGPT come from?

The process started around 10 years ago with efforts picking up in 2018

What is a Generative Pretrained Transformer?

a specific type of LLM

Dall-E

an AI tool where you type a text description and it generates an image for you

Democratized AI

everyone can use it

Wikipedia

the full text of Wikipedia was included in ChatGPT's training data

5 Steps to Deep Learning

The process of going through all of the data and putting it into the system:
- Data Collection
- Training
- Neural Network Architecture (NNA)
- Fine-Tuning
- Feedback Loop

Neural Network Architecture (NNA)

human beings' attempt to get a machine to think the way the human brain thinks

WebText2

a Reddit-focused dataset - it collected text from web pages linked in Reddit posts that had 3 or more upvotes (see the filter sketch below)
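A minimal sketch of that WebText2-style filter with made-up post data: keep only links whose Reddit post earned at least 3 upvotes, then scrape the kept pages for text.

```python
reddit_posts = [
    {"url": "https://example.com/good-article",  "upvotes": 57},
    {"url": "https://example.com/spam",          "upvotes": 1},
    {"url": "https://example.com/great-writeup", "upvotes": 3},
]

MIN_UPVOTES = 3
kept_urls = [p["url"] for p in reddit_posts if p["upvotes"] >= MIN_UPVOTES]
print(kept_urls)   # pages behind these links would then be scraped for text
```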

Common Crawl

crawls as many webpages as it can and archives them - a snapshot of what people have published on the internet
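A minimal sketch of the crawl-and-archive idea for a single placeholder URL and a hypothetical output folder; Common Crawl itself does this at a vastly larger scale and stores the results in standard archive files.

```python
from urllib.request import urlopen
from pathlib import Path

url = "https://example.com/"                 # placeholder page to archive
html = urlopen(url, timeout=10).read()       # fetch the raw page contents

archive_dir = Path("crawl_archive")          # hypothetical archive folder
archive_dir.mkdir(exist_ok=True)
(archive_dir / "example.com.html").write_bytes(html)
```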

Turing Test

a machine passes the test if it gets you to believe it is a human - passing it has been a goal for ChatGPT

Large Language Model (LLM)

the company tried to collect as many words as possible from as many sources as possible - the foundation that ChatGPT is built on

GPT role in ChatGPT

the engine

Role of ChatGPT in ChatGPT

the interface

LLM role in ChatGPT

the technology

Fine Tuning

the model compares candidate answers against each other over and over to find what makes more sense - it is also fine-tuned on specific tasks to improve its performance (see the sketch below)
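A minimal sketch of the fine-tuning idea with a toy model and made-up task data: start from an already-trained model and keep training it, at a smaller learning rate, on examples from the specific task you care about.

```python
import torch
import torch.nn as nn

base_model = nn.Linear(4, 2)                     # stand-in for a pretrained model
# base_model.load_state_dict(torch.load("pretrained.pt"))  # hypothetical checkpoint

task_inputs  = torch.randn(16, 4)                # made-up task examples
task_targets = torch.randint(0, 2, (16,))        # made-up task labels

optimizer = torch.optim.Adam(base_model.parameters(), lr=1e-4)  # small LR for fine-tuning
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    logits = base_model(task_inputs)
    loss = loss_fn(logits, task_targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(loss.item())   # the base model is now adapted to the specific task
```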

Hallucination

when ChatGPT gives you an answer that is completely untrue - Ex: it is bad at citations and often invents ones that do not exist

