ChatGPT AI/ML


Gradient Vanishing/Exploding Problem

A challenge in training deep networks where gradients become too small or too large, hindering learning.

Neural Network

A computational model inspired by the human brain, consisting of interconnected nodes or neurons.

Convolutional Neural Network (CNN)

A deep learning model particularly effective for image and spatial data processing.

Principal Component Analysis (PCA)

A dimensionality reduction technique that transforms data into principal components.

Natural Language Processing (NLP)

A field of AI focused on enabling machines to understand, interpret, and respond to human language.

Activation Function

A function applied to a neuron's output to introduce non-linearities, such as ReLU or Sigmoid.
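For illustration, a minimal plain-Python sketch of two common activation functions mentioned above (function names are mine, not from this set):

```python
import math

def relu(x):
    # ReLU: pass positive values through unchanged, clamp negatives to zero
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid: squash any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))
```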

Softmax

A function converting logits into probabilities for classification tasks in neural networks.
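A minimal sketch of the definition, using the standard max-subtraction trick for numerical stability (implementation details are illustrative):

```python
import math

def softmax(logits):
    # Subtract the max logit before exponentiating to avoid overflow;
    # this does not change the resulting probabilities
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])  # larger logits get larger probabilities
```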

Bayesian Network

A graphical model representing probabilistic relationships among variables using Bayes' theorem.

Gradient Boosting

A machine learning technique that builds models sequentially to correct errors of previous models.

Loss Function

A mathematical function that quantifies the difference between predicted and actual outputs in a model.
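As one concrete example, mean squared error (a common loss for regression; the function name is mine):

```python
def mse(y_true, y_pred):
    # Mean squared error: average of squared differences between
    # actual and predicted outputs
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```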

Ensemble Learning

A method that combines multiple models to improve overall performance, like bagging or boosting.

Monte Carlo Simulation

A method using repeated random sampling to estimate statistical properties of a system.
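A classic illustration: estimating pi by sampling random points in the unit square (seeded here so the sketch is reproducible; parameter names are mine):

```python
import random

def estimate_pi(n_samples, seed=0):
    # The fraction of uniform points in the unit square that land inside
    # the quarter circle of radius 1 approximates pi / 4
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples
```

The estimate improves as the number of samples grows, with error shrinking roughly as one over the square root of the sample count.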

IoU (Intersection over Union)

A metric used in computer vision to measure the overlap between predicted and ground truth regions.
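A minimal sketch for axis-aligned bounding boxes given as `(x1, y1, x2, y2)` corners (this box convention is an assumption for illustration):

```python
def iou(box_a, box_b):
    # Intersection rectangle: overlap of the two boxes (may be empty)
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```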

Generative Adversarial Network (GAN)

A model comprising a generator and a discriminator that work adversarially to produce realistic outputs.

Pretrained Model

A model that has already been trained on a large dataset and can be fine-tuned for specific tasks.

Transformer

A neural network architecture designed for handling sequential data, foundational to many NLP tasks.

Recurrent Neural Network (RNN)

A neural network designed for sequential data processing, like time series or text.

Autoencoder

A neural network that learns to encode and decode data, often used for dimensionality reduction.

Markov Chain

A probabilistic model describing a sequence of states with transitions dependent only on the current state.
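A small sketch: propagating a probability distribution through a two-state transition table (the weather example and numbers are illustrative only):

```python
def step(dist, transitions):
    # One Markov step: the new probability of each state is the sum over
    # previous states of P(prev) * P(prev -> state)
    states = list(transitions)
    return {s: sum(dist[p] * transitions[p][s] for p in states)
            for s in states}

# Transition probabilities depend only on the current state
T = {"sunny": {"sunny": 0.9, "rainy": 0.1},
     "rainy": {"sunny": 0.5, "rainy": 0.5}}
dist = {"sunny": 1.0, "rainy": 0.0}
for _ in range(50):
    dist = step(dist, T)  # converges toward the stationary distribution
```

For this chain the stationary distribution is sunny with probability 5/6, regardless of the starting state.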

Dropout

A regularization technique that randomly drops neurons during training to prevent overfitting.
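A sketch of "inverted" dropout, the variant most frameworks use (seeded for reproducibility; names are mine):

```python
import random

def dropout(values, p, seed=0):
    # Zero each value with probability p, and scale the survivors by
    # 1/(1-p) so the expected activation is unchanged during training
    rng = random.Random(seed)
    return [0.0 if rng.random() < p else v / (1.0 - p) for v in values]
```

At inference time dropout is disabled and values pass through unchanged.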

k-Nearest Neighbors (k-NN)

A simple algorithm that classifies data based on the nearest neighbors in feature space.
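A minimal sketch: majority vote among the k training points closest (by Euclidean distance) to the query (function and variable names are mine):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    # train: list of (point, label) pairs; sort by distance to the query
    nearest = sorted(train, key=lambda pl: math.dist(pl[0], query))[:k]
    # Majority vote among the k nearest labels
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```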

Decision Tree

A simple predictive model that splits data into branches to make decisions based on conditions.

Underfitting

A situation where a model is too simple to capture the underlying patterns in the data.

Overfitting

A situation where a model learns the training data too well, resulting in poor generalization to unseen data.

Deep Learning

A subset of machine learning using neural networks with many layers to learn from large amounts of data.

Support Vector Machine (SVM)

A supervised learning algorithm that finds the best hyperplane to classify data.

Recommender System

A system that suggests relevant items to users, such as products, movies, or music.

One-Hot Encoding

A technique for representing categorical data as binary vectors.
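A one-line sketch of the idea (the category list and function name are illustrative):

```python
def one_hot(categories, value):
    # Binary vector with a single 1 at the position of `value`
    # in the fixed list of known categories
    return [1 if c == value else 0 for c in categories]
```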

t-SNE

A technique for visualizing high-dimensional data in lower dimensions, preserving local structures.

Cross-Validation

A technique to evaluate model performance by splitting the data into training and validation subsets.
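A sketch of k-fold splitting, the most common variant: each fold serves once as the validation set while the remaining folds form the training set (names are mine):

```python
def k_fold_indices(n, k):
    # Split indices 0..n-1 into k nearly equal folds
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    # Yield (train, validation) index lists, one per fold
    splits = []
    for i in range(k):
        val = folds[i]
        train = [idx for j, f in enumerate(folds) if j != i for idx in f]
        splits.append((train, val))
    return splits
```

The model is trained and scored once per split, and the k scores are averaged for a more reliable performance estimate.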

Transfer Learning

A technique where a model trained on one task is reused for a related task.

Generative AI

A type of AI model that generates new content, such as text, images, or music, based on learned patterns.

Long Short-Term Memory (LSTM)

A type of RNN that can learn long-term dependencies in data, overcoming vanishing gradient problems.

Artificial Neural Network (ANN)

The general class of brain-inspired models built from layers of interconnected artificial neurons, used for tasks like classification, regression, and more.

ReLU (Rectified Linear Unit)

A widely used activation function that outputs the input if positive, else 0.

Adam Optimizer

A widely used optimization algorithm combining momentum and adaptive learning rates.

Sigmoid Function

An activation function that maps inputs to a range between 0 and 1, often used for binary classification.

Reinforcement Learning

A learning paradigm in which an agent learns to make decisions by performing actions in an environment and receiving rewards or penalties.

Backpropagation

An algorithm for training neural networks by calculating and propagating the gradient of the loss function.

Random Forest

An ensemble learning method using multiple decision trees for better accuracy and robustness.

Gradient Descent

An optimization algorithm used to minimize the loss function by updating model parameters iteratively.
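A minimal one-dimensional sketch: repeatedly step against the gradient of a known function (the example function and learning rate are illustrative):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    # Each update moves the parameter opposite the gradient,
    # scaled by the learning rate
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3);
# the minimum is at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```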

Clustering

An unsupervised learning task to group similar data points together, such as k-means clustering.

Ethics in AI

Considerations about the fairness, accountability, and societal impact of AI systems.

Sparse Data

Data with many missing or zero values, common in fields like NLP or recommender systems.

Big Data

Extremely large datasets that require specialized tools for storage, processing, and analysis.

Unsupervised Learning

Learning from unlabeled data to discover patterns or structures, such as clustering or dimensionality reduction.

Supervised Learning

Machine learning approach where models are trained on labeled data to predict outputs for unseen data.

Cloud ML

Machine learning services provided on cloud platforms, such as AWS SageMaker or Google AI Platform.

Hyperparameter

A configuration value set before training that influences learning behavior, such as the learning rate or batch size, as opposed to parameters learned during training.

Epoch

One full pass through the training dataset during the training process.

Data Augmentation

Techniques to increase the diversity of data by creating modified versions of the dataset.

Regularization

Techniques to prevent overfitting, such as L1 or L2 penalties or dropout.

Explainability

The ability to understand and interpret how machine learning models make predictions.

Bias

The error introduced by approximating a real-world problem with a simplified model.

Batch Size

The number of samples processed before updating the model parameters during training.

Prompt Engineering

The practice of crafting effective prompts to improve generative AI model outputs.

Fine-Tuning

The process of adapting a pretrained model to a specific task or domain.

Tokenization

The process of breaking text into smaller units, such as words or subwords, for NLP tasks.

Feature Engineering

The process of selecting, modifying, or creating features to improve model performance.

Variance

The sensitivity of a model to fluctuations in the training data, potentially leading to overfitting.

Perceptron

The simplest type of neural network, used for binary classification tasks.
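A sketch of the classic perceptron learning rule, trained here on logical AND, which is linearly separable (hyperparameters and names are illustrative):

```python
def train_perceptron(data, epochs=20, lr=0.1):
    # data: list of (inputs, target) pairs with targets 0 or 1
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in data:
            # Step activation: output 1 if the weighted sum exceeds 0
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - pred
            # Perceptron update: nudge weights toward correct outputs
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

For linearly separable data the perceptron is guaranteed to converge; for non-separable problems like XOR it never will, which motivated multi-layer networks.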

Learning Rate

The step size for adjusting model parameters during gradient descent.

