[Deep Learning] Terms and Concepts

sdbeans 2022. 1. 23. 16:57

A running glossary of terms compiled while studying deep learning.

 

A

anomaly maps

in unsupervised anomaly detection, a low-dimensional representation onto which anomalies are mapped, produced for example with the self-organizing map (SOM) method

 

C

contrastive learning

just as the name suggests, this method learns from the contrast between inputs, comparing the similarities and dissimilarities of input features; a subset of self-supervised learning

the goal is to build representations that generalize (a goal shared with self-supervised learning)

 

contrastive loss

a type of training objective for deep contrastive learning; measures how similar samples from the same class are to each other, and how dissimilar samples from different classes are from each other
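The classic pairwise form of this objective can be sketched as follows; the margin value and the toy embeddings are illustrative, not part of any particular paper's setup:

```python
import numpy as np

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Pairwise contrastive loss sketch: pulls embeddings of same-class
    pairs together, pushes different-class pairs at least `margin` apart."""
    d = np.linalg.norm(z1 - z2)              # Euclidean distance between embeddings
    if same_class:
        return d ** 2                        # similar pair: penalize any distance
    return max(0.0, margin - d) ** 2         # dissimilar pair: penalize closeness

# Identical embeddings of the same class incur zero loss ...
print(contrastive_loss(np.array([1.0, 0.0]), np.array([1.0, 0.0]), True))
# ... while nearby embeddings of different classes are penalized.
print(contrastive_loss(np.array([1.0, 0.0]), np.array([1.0, 0.1]), False))
```

In practice the loss is averaged over many pairs sampled from a batch.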

 

contrastive representation learning

in a set of data samples, similar samples are embedded close to each other, while dissimilar samples are placed far from each group of similar samples;

in other words, grouping similar samples while separating the different groups

 

D

downsampling

a.k.a. pooling. As a layer in a CNN, this layer takes high-resolution (image) data and compresses it to reduce its dimensionality. This reduces the possibility of overfitting, and it also helps to highlight the important structural features of the data.
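A minimal sketch of the most common variant, 2×2 max pooling with stride 2, on a single-channel array:

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling with stride 2: keeps the strongest activation in
    each 2x2 window, quartering the spatial resolution."""
    h, w = x.shape
    # Trim odd edges, then group pixels into 2x2 blocks and take each block's max.
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

img = np.array([[1, 2, 0, 1],
                [3, 4, 1, 0],
                [0, 1, 5, 6],
                [2, 1, 7, 8]])
print(max_pool_2x2(img))   # [[4 1]
                           #  [2 8]]
```

Average pooling works the same way with `.mean(axis=(1, 3))` instead of `.max(...)`.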

 

I

inductive learning

a semi-supervised learning algorithm that generalizes from labeled training data to new, unseen data such as a test set

 

L

linear method (linear classifier)

not that the computation is linear, but that the decision boundary is linear (e.g. logistic regression, softmax classifier)
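A small sketch of why logistic regression counts as linear despite the nonlinear sigmoid: the decision rule sigmoid(w·x + b) ≥ 0.5 reduces to the linear condition w·x + b ≥ 0, so the boundary is a hyperplane (the weights here are arbitrary illustrative values):

```python
import numpy as np

w = np.array([2.0, -1.0])   # example weights (arbitrary)
b = -0.5

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def predict(x):
    # Decision via the nonlinear sigmoid output.
    return int(sigmoid(w @ x + b) >= 0.5)

def predict_linear(x):
    # Equivalent decision via the linear score alone.
    return int(w @ x + b >= 0)

x = np.array([1.0, 0.5])
print(predict(x), predict_linear(x))   # the two rules always agree
```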

 

M

meta learning

a method to learn how to learn

 

R

representation learning

learning to produce new features that represent the input

done by processing the input through feature maps

so that subsequent learning tasks can be performed more easily

 

S

seasonality correction

a method that adjusts time-series data to remove seasonal factors and recurring seasonal patterns.
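One simple way to do this (a sketch, not the only method) is to subtract the mean of each position in the seasonal cycle, e.g. the average of each month across years:

```python
import numpy as np

def deseasonalize(values, period=12):
    """Remove a repeating seasonal pattern by subtracting the mean of
    each position in the cycle (e.g. each month of the year)."""
    values = np.asarray(values, dtype=float)
    phase = np.arange(len(values)) % period          # position within the cycle
    seasonal = np.array([values[phase == p].mean() for p in range(period)])
    return values - seasonal[phase]                  # residual after removing seasonality

# Toy series: a fixed 4-step seasonal pattern repeated three times.
series = [10, 20, 30, 40] * 3
print(deseasonalize(series, period=4))   # all zeros: the series was purely seasonal
```

Real corrections (e.g. in statsmodels' seasonal decomposition) additionally separate out a trend component before estimating the seasonal means.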

 

self-supervised learning (자기지도학습)

a subset of unsupervised learning; does not need labeled data; creates self-defined pseudo-labels as supervision and learns representations, which are then used in downstream tasks;
commonly used in GANs and contrastive learning; the goal is to build representations that generalize (a goal shared with contrastive learning)

 

semi-supervised learning (준지도학습)

somewhere between supervised and unsupervised learning; a small number of labeled samples combined with a large amount of unlabeled samples; often used for tasks where labeled data are rare or expensive; a semi-supervised algorithm is considered effective if it outperforms a supervised algorithm trained on the labeled samples alone.

 

T

transductive learning

a semi-supervised learning algorithm that uses labeled training data to generalize to the unlabeled training data; unlike inductive learning, the unlabeled points to be predicted are already known at training time

 

transfer learning (전이학습)

a model trained for one task is reused for another task;

can improve performance on the second task; related to multi-task learning; in deep learning it only works well if the first task is general rather than too specific
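A minimal numpy sketch of the idea, under toy assumptions: the "pretrained" extractor here is just a fixed random projection standing in for the early layers of a network trained on a large, general dataset; only a small linear head is trained on the new task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "pretrained" feature extractor: in a real setting these would
# be the early layers of a network trained on a large, general dataset.
W_pretrained = rng.normal(size=(4, 3))

def extract_features(x):
    # Frozen during transfer: the learned representation is reused as-is.
    return np.tanh(x @ W_pretrained)

def log_loss(w_head, F, y):
    p = 1 / (1 + np.exp(-F @ w_head))
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# New (toy) task: train only a small linear head on the frozen features.
X = rng.normal(size=(20, 4))
y = (X[:, 0] > 0).astype(float)            # illustrative binary labels
F = extract_features(X)
w_head = np.zeros(3)

loss_before = log_loss(w_head, F, y)
for _ in range(200):                       # plain gradient descent on the head only
    p = 1 / (1 + np.exp(-F @ w_head))
    w_head -= 0.5 * F.T @ (p - y) / len(y)
loss_after = log_loss(w_head, F, y)
print(loss_before, loss_after)             # the head's loss decreases
```

In deep learning frameworks the same pattern appears as freezing the pretrained layers (e.g. disabling their gradients) and training only a new output layer, optionally fine-tuning everything afterwards.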