
Why is the accuracy on the training dataset not always 100%?

Posted a month ago

Why is the accuracy of the training dataset not always 100% when we use the same dataset to train the model?

Training dataset
1 Answer

The accuracy of a model on the training dataset is not always 100% because most real datasets contain noise, mislabeled examples, and overlapping classes, and the model's capacity and any regularization limit how exactly it can fit them. A model with limited flexibility, for example a linear classifier on data that is not linearly separable, simply cannot classify every training example correctly.

In fact, reaching 100% training accuracy is often a warning sign of overfitting rather than a goal. An overfit model memorizes the training data, including its noise, instead of learning the underlying patterns that generalize. This yields very high accuracy on the training data but poor performance on new, unseen data.
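As a minimal sketch of the first point (assuming scikit-learn is available; the data here is synthetic and purely illustrative), two overlapping classes show why even a reasonable model's training accuracy stays below 100%:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Two overlapping Gaussian classes: no linear boundary can
# separate them perfectly, so training accuracy stays below 100%.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.5, size=(200, 2)),
               rng.normal(+1.0, 1.5, size=(200, 2))])
y = np.array([0] * 200 + [1] * 200)

model = LogisticRegression(max_iter=1000).fit(X, y)
train_acc = model.score(X, y)  # accuracy on the same data used for training
print(f"training accuracy: {train_acc:.2f}")  # well above chance, below 1.0
```

Because the two clusters genuinely overlap, some training points are misclassified no matter how the boundary is placed; this is irreducible error, not a training bug.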

To keep a model from becoming too complex and overfitting, it is common to use techniques such as regularization, early stopping, and cross-validation (which estimates performance on held-out data). These techniques deliberately trade a little training accuracy for better generalization, resulting in better overall performance on new data.
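As an illustrative sketch of that trade-off (the dataset and the `C` values here are made up for demonstration), the strength of L2 regularization in scikit-learn's `LogisticRegression` can be compared with cross-validation:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic data with 10% flipped labels, so a perfect training fit
# would require memorizing noise.
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)

# Smaller C = stronger L2 regularization = simpler model.
for C in (100.0, 1.0, 0.01):
    model = LogisticRegression(C=C, max_iter=1000)
    cv_acc = cross_val_score(model, X, y, cv=5).mean()  # generalization estimate
    train_acc = model.fit(X, y).score(X, y)             # fit to training data
    print(f"C={C:>6}: train={train_acc:.2f}  cv={cv_acc:.2f}")
```

The pattern to look for is that the weakly regularized model tends to score higher on the training data than in cross-validation; the `C` with the best cross-validated score, not the best training score, is the one to pick.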
