| 0. Lecture Materials
CS231n: Convolutional Neural Networks for Visual Recognition
- CS231n full course materials: course materials link
- CS231n Lecture 2: YouTube link
- CS231n Lecture 2 slides: slide.pdf
| 1. Lecture Goals
- K-Nearest Neighbor
- Linear classifiers: SVM, Softmax
- Two-layer neural network
- Image features
| 2. Image Classification
The Problem: Semantic Gap
An image is just a big grid of numbers between [0, 255]:
A computer sees an image only as pixel values from 0 to 255. Because of this, several obstacles stand in the way of classifying images:
- Viewpoint variation
- Illumination
- Deformation
- Occlusion
- Background clutter (the object blends into the background)
- Intra-class variation
The data-driven approach to image classification
- Collect a dataset of images and labels
- Use Machine Learning to train a classifier
- Evaluate the classifier on new images
First classifier: Nearest Neighbor
- Memorize all data and labels
def train(images, labels):
    # Machine learning!
    return model
- Predict the label of the most similar training image
def predict(model, test_images):
    # Use model to predict labels
    return test_labels
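The two stubs above can be fleshed out as a minimal NumPy sketch of the Nearest Neighbor classifier (an assumed interface for illustration, not the course's exact code): training memorizes the data, and prediction copies the label of the closest training image under the L1 distance.

```python
import numpy as np

class NearestNeighbor:
    def train(self, X, y):
        # "Training" is just memorizing all data and labels.
        self.X_train = X
        self.y_train = y

    def predict(self, X):
        # For each test image, copy the label of the closest training
        # image under the L1 distance (sum of absolute pixel differences).
        preds = np.empty(X.shape[0], dtype=self.y_train.dtype)
        for i in range(X.shape[0]):
            dists = np.abs(self.X_train - X[i]).sum(axis=1)
            preds[i] = self.y_train[np.argmin(dists)]
        return preds
```

Note that train is O(1) but predict is O(N) per test image, which is the wrong way around for deployment.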
| 3. K-Nearest Neighbors
Parameter of K-Nearest Neighbors
- K
- Distance Metric
| K closest points
Instead of copying the label from the single nearest neighbor, take a majority vote among the K closest points.
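The majority-vote step can be sketched as follows (a minimal illustration assuming an L2 distance; the function name is hypothetical):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Distances from the query point x to every training point (L2).
    dists = np.sqrt(((X_train - x) ** 2).sum(axis=1))
    # Indices of the k closest training points.
    nearest = np.argsort(dists)[:k]
    # Majority vote over their labels.
    votes = Counter(y_train[nearest].tolist())
    return votes.most_common(1)[0][0]
```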
| Distance Metrics (L1, L2)
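The two metrics named above, written out for pixel vectors: L1 sums absolute differences, L2 is the Euclidean distance.

```python
import numpy as np

def l1_distance(x1, x2):
    # d1(I1, I2) = sum over pixels p of |I1_p - I2_p|
    return np.abs(x1 - x2).sum()

def l2_distance(x1, x2):
    # d2(I1, I2) = sqrt(sum over pixels p of (I1_p - I2_p)^2)
    return np.sqrt(((x1 - x2) ** 2).sum())
```

L1 depends on the coordinate frame (rotating the axes changes it), while L2 does not; the lecture suggests L1 when individual features have specific meanings.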
| 4. Hyperparameters (Machine Learning)
In machine learning, a hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters (typically node weights) are derived via training.
These are hyperparameters: choices about the algorithm that we set rather than learn.
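The standard recipe for setting such choices (e.g. K, or the distance metric) is to score each candidate on a held-out validation split and never tune on the test set. A minimal sketch, where `evaluate` is an assumed callback returning validation accuracy:

```python
def choose_hyperparameter(candidates, evaluate):
    # Keep the candidate with the best validation accuracy.
    best, best_acc = None, -1.0
    for value in candidates:
        acc = evaluate(value)
        if acc > best_acc:
            best, best_acc = value, acc
    return best
```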
Why k-NN is rarely used for image classification
- very slow at test time
- Distance metrics on pixels are not informative
- Curse of dimensionality
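The last point can be made concrete with a small worked example: to keep training points densely covering the input space, the number of points needed grows exponentially with the dimension (a toy illustration, not from the slides).

```python
def points_for_coverage(bins_per_axis, dims):
    # Covering a grid with a fixed resolution per axis needs
    # bins_per_axis ** dims points: exponential in the dimension.
    return bins_per_axis ** dims
```

With 4 bins per axis, 1D needs 4 points, 2D needs 16, and 3D already needs 64; image pixel spaces have thousands of dimensions.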
| 5. Linear Classification
In the field of machine learning, the goal of statistical classification is to use an object's characteristics to identify which class (or group) it belongs to. A *linear classifier* achieves this by making a classification decision based on the value of a linear combination of the characteristics. An object's characteristics are also known as feature values and are typically presented to the machine in a vector called a feature vector.
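Concretely, a linear classifier computes one score per class as f(x) = Wx + b, a linear combination of the feature vector; the predicted class is the one with the highest score. A minimal sketch:

```python
import numpy as np

def linear_scores(W, b, x):
    # W: (num_classes, num_features) weight matrix
    # b: (num_classes,) bias vector
    # x: (num_features,) feature vector, e.g. flattened image pixels
    return W @ x + b  # one score per class
```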
- Hard cases for a linear classifier