The AUC metric value for your model is 0.48. What does that mean?
1. Whatever evaluation you did is average and may or may not be correct.
2. Whatever evaluation you did is highly accurate.
3. Whatever evaluation you did is more like taking a guess at the result.
4. Whatever evaluation you did is highly inaccurate.
Correct Answer: 3. Explanation: As per the AWS documentation:
The actual output of many binary classification algorithms is a prediction score. The score indicates the system's certainty that the given observation belongs to the positive class (that the actual target value is 1).
Binary classification models in Amazon ML output a score that ranges from 0 to 1. As a consumer of this score, to decide whether the observation should be classified as 1 or 0, you interpret the score by picking a classification threshold, or cut-off, and comparing the score against it. Any observations with scores higher than the cut-off are predicted as target = 1, and scores lower than the cut-off are predicted as target = 0.
In Amazon ML, the default score cut-off is 0.5. You can choose to update this cut-off to match your business needs. You can use the visualizations in the console to understand how the choice of cut-off will affect
your application.
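The thresholding described above can be sketched in a few lines of Python. This is a minimal illustration, not Amazon ML's implementation; the function name and sample scores are invented for the example.

```python
def classify(scores, cutoff=0.5):
    """Apply a score cut-off: scores above the cut-off are predicted
    as target = 1, all others as target = 0."""
    return [1 if score > cutoff else 0 for score in scores]

# Hypothetical prediction scores from a binary classification model.
scores = [0.12, 0.48, 0.51, 0.97]

print(classify(scores))              # default cut-off of 0.5 -> [0, 0, 1, 1]
print(classify(scores, cutoff=0.4))  # lower cut-off -> [0, 1, 1, 1]
```

Lowering the cut-off classifies more observations as positive, which is the trade-off the console visualizations help you explore.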
Measuring ML Model Accuracy
Amazon ML provides an industry-standard accuracy metric for binary classification models called Area Under the (Receiver Operating Characteristic) Curve (AUC). AUC measures the ability of the model to predict a higher
score for positive examples as compared to negative examples. Because it is independent of the score cut-off, you can get a sense of the prediction accuracy of your model from the AUC metric without picking a threshold.
The AUC metric returns a decimal value from 0 to 1. AUC values near 1 indicate an ML model that is highly accurate. Values near 0.5 indicate an ML model that is no better than guessing at random. Values near 0 are
unusual to see, and typically indicate a problem with the data. Essentially, an AUC near 0 says that the ML model has learned the correct patterns, but is using them to make predictions that are flipped from reality
(0s are predicted as 1s and vice versa). For more information about AUC, go to the Receiver operating characteristic page on Wikipedia.
The baseline AUC metric for a binary model is 0.5. It is the value for a hypothetical ML model that randomly predicts a 1 or 0 answer. Your binary ML model must perform better than this baseline to be of any value. An AUC of 0.48, as in this question, sits essentially at the random baseline, so the model is no better than guessing.
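The interpretation above can be made concrete by computing AUC directly from its ranking definition: the probability that a randomly chosen positive example receives a higher score than a randomly chosen negative one. This is a sketch for small data sets, not the estimator Amazon ML uses internally; the labels and scores are invented for illustration.

```python
def auc(labels, scores):
    """AUC as the fraction of (positive, negative) pairs in which the
    positive example scores higher than the negative one (ties count half)."""
    pos = [s for label, s in zip(labels, scores) if label == 1]
    neg = [s for label, s in zip(labels, scores) if label == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 0, 0]
print(auc(labels, [0.9, 0.8, 0.2, 0.1]))  # perfect ranking  -> 1.0
print(auc(labels, [0.1, 0.2, 0.8, 0.9]))  # flipped ranking  -> 0.0
```

The flipped-ranking case shows why an AUC near 0 signals a model whose predictions are reversed relative to reality, while a model that ranks positives and negatives interchangeably lands near 0.5.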