FLOPs (floating point operations): Quick Concept Summary


FLOPS vs FLOPs

  • FLOPS (note: all capitals) is the abbreviation of FLoating point Operations Per Second. It expresses computing speed and is used to measure hardware performance.
  • FLOPs is the abbreviation of FLoating point OPerations: the number of floating-point operations, i.e., the amount of computation. It is used to measure algorithm/model complexity (the sketch below connects the two).
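
Dividing a model's FLOPs by the hardware's FLOPS gives an idealized lower bound on runtime. A minimal sketch of that arithmetic (the ResNet-50 figure is an approximate, commonly cited value; the 10 TFLOPS GPU is hypothetical):

```python
# Idealized runtime = model FLOPs / hardware FLOPS. This ignores memory
# bandwidth, utilization, etc., so real latency is always higher.
model_flops = 4.1e9      # ~4.1 GFLOPs per ResNet-50 forward pass (approx.)
hardware_flops = 10e12   # hypothetical 10 TFLOPS GPU
ideal_latency = model_flops / hardware_flops
print(f"ideal latency: {ideal_latency * 1e3:.2f} ms")  # ~0.41 ms
```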

 

FLOPs vs MAC (Multiply-ACcumulate)

  • FLOPs counts each addition or multiplication as one operation.
  • 1 MAC = 2 FLOPs (1 addition + 1 multiplication).
  • So to convert MACs → FLOPs, multiply by 2; for the reverse, divide by 2 (a tiny sketch follows this list).
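
A tiny sketch of the conversion (function names are just illustrative):

```python
# 1 MAC = 1 multiplication + 1 addition = 2 FLOPs
def macs_to_flops(macs: int) -> int:
    return 2 * macs

def flops_to_macs(flops: int) -> int:
    return flops // 2

print(macs_to_flops(1_000_000))   # 2000000
print(flops_to_macs(2_000_000))   # 1000000
```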

 

Are FLOPs proportional to runtime?

Roughly, yes, but the shape of the FLOPs-vs-latency curve seems to differ from model to model (the sketch below shows why equal FLOPs need not mean equal latency).
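
A minimal timing sketch, assuming PyTorch is available: the two models below have identical FLOPs per sample, yet nothing guarantees identical latency, since structure, kernel-launch overhead, and memory access also matter. Shapes and iteration counts are illustrative.

```python
import time
import torch
import torch.nn as nn

def latency_ms(model, x, warmup=20, iters=200):
    model.eval()
    with torch.no_grad():
        for _ in range(warmup):        # warm-up runs (caches, allocators)
            model(x)
        t0 = time.perf_counter()
        for _ in range(iters):
            model(x)
        elapsed = time.perf_counter() - t0
    return elapsed / iters * 1e3

x = torch.randn(32, 1024)
single = nn.Linear(1024, 1024)                    # 1024*1024 MACs per sample
bottleneck = nn.Sequential(nn.Linear(1024, 512),  # 1024*512 MACs
                           nn.Linear(512, 1024))  # 512*1024 MACs -> same total
print(f"single:     {latency_ms(single, x):.3f} ms")
print(f"bottleneck: {latency_ms(bottleneck, x):.3f} ms")
```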

 

Calculate FLOPs

y = w[0]*x[0] + w[1]*x[1] + ... + w[n-1]*x[n-1]

  • The dot product of two length-n vectors → n multiplications and n-1 additions, 2n-1 FLOPs in total.

FLOPs = 2 × Input Size × Output Size (fully connected layer) = 2 × Number of Kernels × Kernel Shape × Output Shape (conv layer); a sketch of these counting rules follows below.
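
A sketch of these counting rules (variable names are illustrative):

```python
def dot_product_flops(n: int) -> int:
    # n multiplications + (n - 1) additions
    return 2 * n - 1

def dense_flops(input_size: int, output_size: int) -> int:
    # Each output needs a length-`input_size` dot product; the usual
    # approximation drops the "-1" and counts 2 * input * output.
    return 2 * input_size * output_size

def conv_flops(num_kernels, kernel_h, kernel_w, in_channels, out_h, out_w):
    # One dot product of length kernel_h * kernel_w * in_channels
    # per kernel per output position.
    return 2 * kernel_h * kernel_w * in_channels * num_kernels * out_h * out_w

print(dot_product_flops(3))               # 5
print(dense_flops(1024, 1024))            # 2097152
print(conv_flops(64, 3, 3, 3, 224, 224))  # 173408256
```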

 

Things to consider for a deep learning model

  • Model computation (FLOPs)
  • Main memory usage (memory footprint; see the sketch after this list)
  • Runtime (latency)
  • Power consumption
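
For the memory item, a rough sketch that estimates the memory taken by the weights alone (activations and optimizer state are ignored, so this is only a lower bound; fp32 assumed):

```python
def param_memory_mb(num_params: int, bytes_per_param: int = 4) -> float:
    # fp32 weights take 4 bytes each; fp16 would halve this.
    return num_params * bytes_per_param / 1024**2

# e.g. a 25.6M-parameter model (roughly ResNet-50 sized) in fp32:
print(f"{param_memory_mb(25_600_000):.1f} MB")  # ~97.7 MB
```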

 

Parameters and FLOPs?

What is a parameter?

In a CNN, each layer has two kinds of parameters: weights and biases.

The total number of parameters is just the sum of all weights and biases.

Let's define W_c = number of weights of the conv layer, and B_c = number of biases of the conv layer.
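
A sketch of the parameter count for a conv layer, assuming the standard formulas (not spelled out in the truncated text above): W_c = kernel_h × kernel_w × in_channels × num_kernels, and B_c = num_kernels (one bias per kernel).

```python
def conv_layer_params(kernel_h, kernel_w, in_channels, num_kernels):
    w_c = kernel_h * kernel_w * in_channels * num_kernels  # weights (W_c)
    b_c = num_kernels                                      # biases (B_c)
    return w_c + b_c

# e.g. a 3x3 conv from 3 to 64 channels: 3*3*3*64 + 64 = 1792 parameters
print(conv_layer_params(3, 3, 3, 64))  # 1792
```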

 

 
