Suppose you have created a model for an Artificial Intelligence/Machine Learning problem that works well and achieves 90 to 95% accuracy. Then you take the same model, apply it to another problem, and find that it performs poorly, achieving only 50 to 55% accuracy. This illustrates the idea that one model cannot fit all situations.
For example, suppose you need to drive from home to the office during the morning peak traffic hours, and you have an in-person presentation at the office in 40 minutes. The GPS shows a 50-minute drive on your usual route. You check another route that takes only 30 minutes but is 5 miles longer, you take it, and you arrive before time. You might then ask, “Can we always choose the extra 5-mile route as the default setting?” The answer is no: if you drive during non-peak hours, the longer route simply costs you extra fuel.
So, a single model created in Artificial Intelligence/Machine Learning may not fit every problem.
David Wolpert’s 1996 “No Free Lunch” theorem is often quoted as follows:
“If algorithm A outperforms algorithm B on some cost functions, then, loosely speaking, there must exist exactly as many other functions where algorithm B outperforms algorithm A.”
As author Aurélien Géron puts it in his book “Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow”:
“To decide what data to discard and what data to keep, you must make assumptions. For example, a linear model assumes that the data is fundamentally linear and that the distance between the instances and the straight line is just noise, which can safely be ignored.”
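To make that assumption concrete, here is a minimal sketch, assuming NumPy and scikit-learn are available; the synthetic data and its coefficients are invented purely for illustration. It fits a straight line to data that is fundamentally linear plus noise, and the residuals are exactly the distances the linear model “safely ignores.”

```python
# Minimal sketch (synthetic data, invented for illustration): fit a
# straight line to noisy but fundamentally linear data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(100, 1))                          # single feature
y = 3.0 * X.ravel() + 2.0 + rng.normal(scale=1.5, size=100)    # line + noise

model = LinearRegression().fit(X, y)
print("learned slope:", model.coef_[0])
print("learned intercept:", model.intercept_)

# The residuals are what the linear model treats as noise that
# "can safely be ignored".
residuals = y - model.predict(X)
print("residual standard deviation:", residuals.std())
```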
Some datasets are fitted well by a Logistic Regression model, while for other datasets an LSTM model works better, so the data you have largely decides which model will be accurate.
How do you decide which model works better?
The answer is to evaluate the candidate models on your dataset and check which one works best.
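As a minimal sketch of that evaluation step, assuming scikit-learn is installed, you can compare a few candidate models with cross-validation and keep whichever scores best. The bundled breast cancer dataset and the three candidates below are only placeholders for your own problem, not a recommendation.

```python
# Minimal sketch, assuming scikit-learn: evaluate several candidate
# models on the same dataset with cross-validation and compare scores.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "Logistic Regression": LogisticRegression(max_iter=5000),
    "Random Forest": RandomForestClassifier(random_state=42),
    "SVM (RBF kernel)": SVC(),
}

for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")
```

Whichever model wins here is only the best for this particular dataset; on a different problem the ranking can flip, which is exactly the point of the No Free Lunch theorem.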
Please comment below on your thoughts on the “No Free Lunch Theorem.”
Further Reading
Posts on Artificial Intelligence, Deep Learning, Machine Learning, and Design Thinking:
Rasa X Open Source Conversational AI UI Walk-through
Artificial Intelligence Chatbot Using Neural Network and Natural Language Processing
Code Example: Import EMNIST Dataset and Print Handwritten Letters
Forecasting a Time Series and Recurrent Neural Network(RNNs)
Pre-trained Models for Transfer Learning
EMNIST Dataset Handwritten Character Digits
MNIST Largest Handwritten Digits Database
Posts on SAP:
SAP AI Business Services – Document Information Extraction
SAP AI Business Services: Document Classification
SAP Intelligent Robotic Process Automation, Use Case, Benefits, and Available Features
A simple wireframe design for SAP FIORI UI Chatbot
Simplified SAP GTS Customs Export/Import Documentation with SAP Event Management
How to create your own SAP Fiori Chatbot in 10 days?
Preconfigured Visibility Process Scenarios in SAP Event Management – Part I
Why we like the SAP Business Rule Framework Plus (SAP BRF+) Recipe?