Suppose you have 10 samples, where 8 are positive and 2 are negative. How do you calculate the entropy? (important to know)

Question

A bit mathematical


Answers ( 21 )

  1. E(S) = -(8/10)*log(8/10) - (2/10)*log(2/10)
    Note: log is base 2

  2. -(8/10)*log(8/10) - (2/10)*log(2/10)

  3. -(8/10)*log(8/10) - (2/10)*log(2/10)

  4. Entropy can be calculated for a random variable X with K discrete states as follows:

    E(X) = -sum(for each state k in K, p(k) * log(p(k)))

    For the example above:
    E(X) = -(8/10)*log(8/10) - (2/10)*log(2/10)
    (See the Python sketch after the answers for the same computation in code.)

  5. E(X) = -(8/10)*log2(8/10) - (2/10)*log2(2/10) = 0.72 (approx.)

  6. Sorry, it's a private answer.

  7. Entropy is nothing but a measure of disorder or uncertainty, and the goal of machine learning models and Data Scientists in general is to reduce that uncertainty.
    E(S) = sum over all i of (-Pi * log2(Pi))
    For the above case, p1 = 8/10 and p2 = 2/10,
    so E(S) = -(2/10)*log2(2/10) - (8/10)*log2(8/10) ≈ 0.72

  8. -(8/10)*log(8/10) - (2/10)*log(2/10) = 0.72
    log is base 2

  9. In a binary classification problem (two classes – positive and negative), Entropy can be calculated as follows:
    Here 0 – negative, 1 – positive.

    [Note: Log is with base 2]

    -(p(0) * log2(p(0)) + p(1) * log2(p(1)))

    = -((2/10)*log2(2/10) + (8/10)*log2(8/10))
    = 0.7219280948873623

  10. Entropy is a measure of disorder or uncertainty and the goal of machine learning models and Data Scientists, in general, is to reduce uncertainty.

    Using the entropy formula, for the above example:
    E(X) = -(8/10) * log(8/10) - (2/10) * log(2/10)

    Note: In the formula, ‘Pi’ is simply the frequentist probability of an element/class ‘i’ in our data. For simplicity's sake, let's say we only have two classes: a positive class and a negative class. Therefore ‘i’ here is either + or -.
    The log is base 2.

  11. entropy = -(8/10)*log(8/10) - (2/10)*log(2/10) = 0.72 approx

    with log base 2

  12. Entropy:
    H(S) = -(8/10)*(log(8/10)/log(2)) - (2/10)*(log(2/10)/log(2))
    H(S) ≈ 0.722

  13. Applying the formula for entropy:
    E(S) = -sum( p(i) * log2(p(i)) )
    E(S) = -((8/10) * log2(8/10) + (2/10) * log2(2/10))
    E(S) ≈ 0.72

  14. -(8/10)*log2(8/10) - (2/10)*log2(2/10) = 0.72

  15. -(8/10)*log2(8/10) - (2/10)*log2(2/10) = 0.72

  16. Entropy = -(8/10)*log(8/10) - (2/10)*log(2/10)

  17. entropy = -(8/10)*log2(8/10) - (2/10)*log2(2/10) ≈ 0.72
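
All of the answers above compute the same quantity. Here is a minimal Python sketch of that calculation, assuming only the standard-library math module; the entropy helper name and the counts list [8, 2] are illustrative (the counts come from the question, not from any particular answer):

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a class distribution given as raw counts."""
    total = sum(counts)
    probs = [c / total for c in counts]
    # Sum -p * log2(p) over all classes; skip zero probabilities,
    # since p * log2(p) -> 0 as p -> 0 and log2(0) is undefined.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 8 positive and 2 negative samples, as in the question.
print(entropy([8, 2]))  # 0.7219280948873623, i.e. ~0.72
```

The printed value, 0.7219280948873623, matches the ~0.72 that most answers report.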
