You have created a model for a fire alarm. Explain the confusion matrix with this example.

Question

Example of confusion matrix


Answers ( 7 )

  1. A fire-alarm model will generally predict whether the alarm will ring or not, given a certain set of features.
    Comparing the model's predictions with the actual outcomes gives a confusion matrix, which consists of
    True Positives (TP), True Negatives (TN), False Positives (FP) and False Negatives (FN).
                      Predicted
                   1         0
              ___________________
    Actual 1 |   TP    |   FN    | ---> P (actual positives)
           0 |   FP    |   TN    | ---> N (actual negatives)
             |_________|_________|

    True Positive – The model predicted the alarm rings and it actually rings.
    True Negative – The model predicted the alarm did not ring and it actually did not ring.
    False Positive – The model predicted the alarm rings but it actually did not ring.
    False Negative – The model predicted the alarm does not ring but it actually rings.

    Accuracy is given as
    Accuracy = (TP + TN)/(TP + TN + FP + FN) ---> how often the classifier is correct.
    Sensitivity (Recall) = TP/P = TP/(TP + FN) ---> when the actual class is positive, how often the model predicts positive.
    Specificity = TN/N = TN/(TN + FP) ---> when the actual class is negative, how often the model predicts negative.
    Precision = TP/(TP + FP) ---> out of all predicted positives, how many are really positive.
    F1 Score = (2 * Precision * Recall)/(Precision + Recall)
    A short Python sketch computing these metrics follows this answer.

    Best answer
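    As a quick illustration of the formulas above, here is a minimal Python sketch. The actual and predicted label lists are made up for the example; 1 means the alarm rings, 0 means it does not.

        # Minimal sketch: count the four confusion-matrix cells from
        # hypothetical alarm data (1 = alarm rings, 0 = it does not).
        actual    = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]   # made-up ground truth
        predicted = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]   # made-up model output

        tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
        tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
        fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
        fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)

        accuracy    = (tp + tn) / (tp + tn + fp + fn)
        recall      = tp / (tp + fn)            # sensitivity: TP / P
        specificity = tn / (tn + fp)            # TN / N
        precision   = tp / (tp + fp)
        f1          = 2 * precision * recall / (precision + recall)

        print(tp, tn, fp, fn)                   # 4 4 1 1
        print(accuracy, recall, precision, f1)  # 0.8 0.8 0.8 0.8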
  2. Sorry, this is a private answer.

  3. The fire-alarm model detects whether there is a fire or not. The null hypothesis here is that the house is not on fire; the alternative hypothesis is that there is a fire.
    The confusion matrix will have four values: True Positives (TP), True Negatives (TN), False Positives (FP) and False Negatives (FN).
    1) TP: There is a fire and the alarm rings
    2) TN: There is no fire and the alarm does not ring
    3) FP: There is no fire but the alarm rings (Type I error: a false alarm)
    4) FN: There is a fire but the alarm does not ring (Type II error: a missed fire)

    We can evaluate the model with:
    1) Accuracy = (TP + TN)/(TP + TN + FP + FN)
    How often the classifier is correct.
    2) Sensitivity (Recall) = TP/(TP + FN)
    When the actual class is positive, how often the model predicts positive.
    3) Specificity = TN/(TN + FP)
    When the actual class is negative, how often the model predicts negative.
    4) Precision = TP/(TP + FP)
    Out of the total predicted positives, how many are really positive.
    5) F1 Score = (2 * Precision * Recall)/(Precision + Recall)
    A worked numeric example follows this answer.
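    To make the formulas concrete, here is a small worked example in Python; all the counts are hypothetical.

        # Worked example with made-up confusion-matrix counts.
        tp, tn, fp, fn = 40, 50, 5, 5

        accuracy    = (tp + tn) / (tp + tn + fp + fn)        # 90/100 = 0.90
        recall      = tp / (tp + fn)                         # 40/45 ~= 0.89
        specificity = tn / (tn + fp)                         # 50/55 ~= 0.91
        precision   = tp / (tp + fp)                         # 40/45 ~= 0.89
        f1 = 2 * precision * recall / (precision + recall)   # ~= 0.89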

  4. A fire-alarm model predicts whether the alarm will ring or not in an emergency.
    A confusion matrix consists of True Positives (TP), True Negatives (TN), False Positives (FP) and False Negatives (FN).

    Actual True – The alarm rings
    Actual False – The alarm doesn’t ring

    Positive – Predicted by model that the alarm rings
    Negative – Predicted by model that the alarm doesn’t ring

    True Positive – The model predicted that the alarm rings and it actually rings.
    True Negative – The model predicted the alarm did not ring and it actually did not ring.
    False Positive – The model predicted the alarm rings but it actually did not ring.
    False Negative – The model predicted the alarm does not ring but it actually rings.

  5. A fire-alarm model will generally predict whether the alarm will ring or not, given a certain set of features.
    Comparing the model's predictions with the actual outcomes gives a confusion matrix, which consists of
    True Positives (TP), True Negatives (TN), False Positives (FP) and False Negatives (FN).
    True Positive – The model predicted the alarm rings and it actually rings
    True Negative – The model predicted the alarm did not ring and it actually did not ring.
    False Positive – The model predicted the alarm rings but it actually did not ring.
    False Negative – The model predicted the alarm does not ring but it actually rings.

    Accuracy is given as
    Accuracy = (TP + TN)/(TP + TN + FP + FN) ---> how often the classifier is correct.
    Sensitivity (Recall) = TP/(TP + FN) ---> when the actual class is positive, how often the model predicts positive.
    Specificity = TN/(TN + FP) ---> when the actual class is negative, how often the model predicts negative.
    Precision = TP/(TP + FP) ---> out of all predicted positives, how many are really positive.
    F1 Score = (2 * Precision * Recall)/(Precision + Recall)
    A scikit-learn sketch producing the same matrix follows this answer.
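    For reference, the same matrix can be produced with scikit-learn (assuming it is installed); the label lists below are made up. For binary 0/1 labels, ravel() flattens the matrix in the order tn, fp, fn, tp.

        # Sketch using scikit-learn on made-up alarm labels.
        from sklearn.metrics import confusion_matrix, precision_score, recall_score, f1_score

        actual    = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
        predicted = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]

        tn, fp, fn, tp = confusion_matrix(actual, predicted).ravel()
        print(tn, fp, fn, tp)                       # 4 1 1 4
        print(recall_score(actual, predicted))      # 0.8 (sensitivity)
        print(precision_score(actual, predicted))   # 0.8
        print(f1_score(actual, predicted))          # 0.8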

  6. Fire alarm prediction:

    There are only two outcomes: the alarm rings or it does not. I will label ring as 1 and not ring as 0.
    True Positive – It was predicted that the alarm would ring and it actually rang.
    True Negative – It was predicted that the alarm would not ring and it actually did not ring.
    False Positive – It was predicted that the alarm would ring but it actually did not ring. This is also known as a Type I error.
    False Negative – It was predicted that the alarm would not ring but it actually rang. This is also known as a Type II error.

    There are some terms associated with the confusion matrix, as follows:
    Accuracy: Out of all the classes, how many we predicted correctly. ((TP + TN)/(TP + TN + FP + FN))
    Recall (r): Out of all the actual positive classes, how many we predicted correctly. (TP/(TP + FN))
    Precision (p): Out of all the classes we predicted as positive, how many are actually positive. (TP/(TP + FP))
    F-score: The F-score measures recall and precision at the same time. It uses the harmonic mean in place of the arithmetic mean, punishing extreme values more (see the sketch below). ((2*p*r)/(p+r))
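    The sketch below shows why the harmonic mean is used: when precision and recall are far apart, the F-score stays close to the smaller value, while the arithmetic mean would hide it. The numbers are made up.

        # Why the F-score uses the harmonic mean: it punishes the
        # low value far more than the arithmetic mean would.
        p, r = 1.0, 0.1   # made-up precision and recall

        arithmetic = (p + r) / 2           # 0.55, looks deceptively good
        harmonic   = 2 * p * r / (p + r)   # ~0.18, dominated by the low recall

        print(arithmetic, harmonic)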
