Jaeyun's Blog
For deep learning
Categories
Starting the Blog
10-12
Prompt Engineering Guide - CoT & ToT
07-09
Prompt Engineering Guide - Introduction
07-02
How ChatGPT Is Trained
06-16
Training Compute-Optimal Large Language Models
06-05
Recommendation Models and LLMs - 2
05-28
Recommendation Models and LLMs - 1
05-19
Exploring AutoGPT
05-07
A Quick Look at LangChain
04-29
What Is LangChain?
04-23
You Only Search Once (YOSO)
01-24
A Style-Based Generator Architecture for GANs - 2
01-16
A Style-Based Generator Architecture for GANs - 1
01-14
Recurrent World Models Facilitate Policy Evolution (World Models) - 2
01-09
Recurrent World Models Facilitate Policy Evolution (World Models) - 1
01-08
SENet (Squeeze-and-Excitation Networks)
07-18
SqueezeNet (Model Compression)
05-26
ENAS (Efficient Neural Architecture Search via Parameter Sharing)
03-15
Siamese Neural Networks for One-shot Image Recognition (Siamese Networks)
02-06
Born Again Neural Networks
01-27
Globally and Locally Consistent Image Completion (Image Restoration)
01-05
WESPE (Weakly Supervised Photo Enhancer for Digital Cameras)
12-14
Capsule Networks (CapsNet) - 3
11-29
Capsule Networks (CapsNet) - 2
11-28
Capsule Networks (CapsNet) - 1
11-28
A simple neural network module for relational reasoning - 2
10-19
A simple neural network module for relational reasoning - 1
10-19
DenseNet (Densely Connected Convolutional Networks) - 3
10-16
DenseNet (Densely Connected Convolutional Networks) - 2
10-15
DenseNet (Densely Connected Convolutional Networks) - 1
10-13
k-Nearest-Neighbor-Based Novelty Detection (k-NN)
01-29
Local Outlier Factor (LOF)
11-10
Kernel Density Estimation - Parzen Window Density Estimation
11-08
Mixture of Gaussian Density Estimation
11-03
Gaussian Density Estimation
11-02
Novelty Detection (Anomaly Detection) - Overview
10-18
Generative Models
12-08
Self-Training
12-07
Semi-supervised Learning - Overview
12-04