[note] Some sample code of Python #3

Ray Lin
Dec 30, 2020

Code notes, kept here for easy lookup.

Removing duplicates from a list

>>> t = [1, 2, 3, 1, 2, 5, 6, 7, 8]
>>> t
[1, 2, 3, 1, 2, 5, 6, 7, 8]
>>> list(set(t))
[1, 2, 3, 5, 6, 7, 8]
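
Note that set() does not guarantee the original order; the result above just happens to look sorted for small integers. If you need to keep the first-occurrence order (Python 3.7+, where dicts preserve insertion order), dict.fromkeys() is a common alternative:

>>> list(dict.fromkeys(t))
[1, 2, 3, 5, 6, 7, 8]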

Evaluation Metrics


# print(" FN FP TP pre acc rec f1")
#print(FN, FP, TP, FN+FP+TP+TF)
precision = TP / (TP + FP)
print(f"precision: {precision:4.2f}")
accuracy = (TP + TN)/(TP + TN + FP + FN)
print(f"accuracy: {recall:4.2f}")
recall = TP / (TP + FN)
print(f"recall: {recall:4.2f}")
f1_score = 2 * precision * recall / (precision + recall)
print(f"f1_score: {f1_score:4.2f}")
# print(f"{FN:6.2f}{FP:6.2f}{TP:6.2f}", end="")
# print(f"{precision:6.2f}{accuracy:6.2f}{recall:6.2f}{f1_score:6.2f}")

Confusion Matrix — Get Items FP/FN/TP/TN — Python

from sklearn.metrics import confusion_matrix

actual = [1, -1, 1, 1, -1, 1]
predicted = [1, 1, 1, -1, -1, 1]

c = confusion_matrix(actual, predicted)
# c:
# array([[1, 1],
#        [1, 3]])

# Rows are actual labels and columns are predicted labels, in sorted label
# order (-1, then 1), so the four cells unpack as TN, FP, FN, TP.
TN, FP, FN, TP = c[0][0], c[0][1], c[1][0], c[1][1]
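
For a binary problem, the scikit-learn documentation also shows a one-line unpacking with ravel(), which flattens the 2x2 matrix row by row:

from sklearn.metrics import confusion_matrix

actual = [1, -1, 1, 1, -1, 1]
predicted = [1, 1, 1, -1, -1, 1]

# ravel() flattens the matrix row by row: TN, FP, FN, TP
TN, FP, FN, TP = confusion_matrix(actual, predicted).ravel()
print(TN, FP, FN, TP)  # 1 1 1 3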

Set a threshold of IoU to determine if the object detection is valid or not

Let’s say you set the IoU threshold to 0.5; in that case (from Renu Khandelwal):

  • True Positive (TP): if IoU ≥ 0.5, classify the detection as a TP.
  • False Positive (FP): if IoU < 0.5, it is a wrong detection; classify it as an FP.
  • True Negative (TN): every part of the image where we did not predict an object. This metric is not useful for object detection, so TN is ignored.
  • False Negative (FN): when a ground-truth object is present in the image and the model fails to detect it, classify it as an FN.

Set the IoU threshold to 0.5 or greater (0.5, 0.75, 0.9, or 0.95, …); a minimal sketch of this rule is shown below.
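
The sketch assumes axis-aligned boxes in [x1, y1, x2, y2] format; the box format, the example boxes, and the function names are illustrative, not from the original post.

def iou(box_a, box_b):
    # Intersection rectangle of the two boxes
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def classify_detection(pred_box, gt_box, threshold=0.5):
    # TP if the predicted box overlaps the ground truth enough, otherwise FP
    return "TP" if iou(pred_box, gt_box) >= threshold else "FP"

print(classify_detection([0, 0, 10, 10], [2, 2, 12, 12]))  # IoU ≈ 0.47 -> FP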
