Induction and Learning

Expressed in the inductive form, the same set of assertions about Socrates would be:

Observation: Socrates is a man

Observation: Socrates is mortal

Rule: Therefore, all men are mortal
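This naive generalization step can be sketched in code. The sketch below is purely illustrative (the function and data names are made up, not from any particular system): from the category memberships observed for a single individual, it proposes "all X are Y" rules in both directions, which sets up the over-generalization problem discussed next.

```python
# Observations: Socrates belongs to the categories "men" and "mortals".
observations = {"Socrates": {"men", "mortals"}}

def induce(memberships):
    """Naive induction: if one individual belongs to two categories,
    propose 'All X are Y' rules -- in BOTH directions."""
    rules = []
    for individual, categories in memberships.items():
        cats = sorted(categories)
        for a in cats:
            for b in cats:
                if a != b:
                    rules.append(f"All {a} are {b}")
    return rules

print(induce(observations))
# Proposes both "All men are mortals" and "All mortals are men",
# each resting on the sole example of Socrates.
```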

The same rule can be expressed graphically.

[Figure: Socrates as the link between "men" and the quality of being mortal]

The middle term here is Socrates, linking "men" and the quality of being mortal. But clearly there is a problem: there is an implicit assumption that Socrates represents all men or, more explicitly, that "Man is Socrates". We know that is not true, but this sort of inference nonetheless occurs quite often in everyday language as over-generalization. In fact, we could equally associate "men" and "mortals" in the other direction and conclude that "all mortals are men", based on the sole example of Socrates.

What does this mean? One way to interpret it is to read the conclusion as: "Based on what we know about Socrates, there is no reason to reject the assertion that 'all men are mortal' or that 'all mortals are men'."

Disproving the notion that "all mortals are men" is an easy task: the first mortal bird or squirrel encountered along the highway would disprove it beyond doubt. However, the only way to prove finally and forever the mortality of all men would be to examine every human on earth (including Pythagoras) and determine the presence or absence of mortality in their set of personal characteristics. Clearly, it is an impossible task, since there would always be the possibility of someone being born who just might disprove the ancient assertion.

Beyond Any Doubt?

In this sense, induction can never prove anything beyond all doubt; it can only establish very high degrees of likelihood. The mathematical laws of probability can set an upper limit on the likelihood of error (assuming that the experimental method was correct, which is not always the case). The assertion that "all men are mortal" is as certain as any truth we know. Even as a rough 'guesstimate', one can say that the chance of error is no more than one in many billions (the 'true' probability of error is much smaller, of course).
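The kind of upper bound mentioned here can be illustrated with the statisticians' "rule of three": if an event has never occurred in n independent trials, its probability is below roughly 3/n at 95% confidence. The figure of 100 billion humans ever born is an assumption used purely for illustration.

```python
def upper_bound_95(n):
    """Rule of three: approximate 95% upper bound on the probability
    of an event that was never observed in n independent trials."""
    return 3.0 / n

# Assumption for illustration: roughly 100 billion humans have ever
# lived, and every one of them has proved mortal.
bound = upper_bound_95(100e9)
print(bound)  # 3e-11: comfortably below "one in many billions"
```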

The Scientific Method

Induction by experimentation is the basis of the scientific method and of knowledge discovery. Most rigorous statistical and numerically intensive techniques fall into the inductive mode. There are many effective models and methods for inductive learning, such as neural networks and various types of associative networks. The results of these models may be difficult to interpret, but the models themselves are now easy to use.
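As a minimal illustration of inductive learning, the sketch below trains a single perceptron (the simplest neural network) on four labeled examples of the logical AND function. The learned weights are the induced "generalized rule"; the example is self-contained and not tied to any particular library.

```python
def train_perceptron(examples, epochs=20, lr=0.1):
    """Induce a linear decision rule from labeled examples using the
    classic perceptron update: nudge weights toward each error."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Labeled observations of logical AND.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(examples)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in examples])  # [0, 0, 0, 1]
```

The induced weights generalize beyond the training pairs: any input on the "true" side of the learned boundary is classified as 1, whether or not it was observed.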

An important feature of inductive inference is the capability to measure precisely the degree of error incurred in associating data from a large population with a generalized rule. This is in stark contrast to the abductive mode of inference.
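This measurement can be made concrete with a standard confidence interval for a proportion: if an induced rule holds for k of n sampled cases, the normal approximation bounds its true accuracy. The sample counts below are invented for illustration.

```python
import math

def accuracy_interval(k, n, z=1.96):
    """95% confidence interval (normal approximation) for the true
    accuracy of a rule that held in k out of n sampled cases."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Hypothetical sample: the rule held in 970 of 1000 cases.
lo, hi = accuracy_interval(970, 1000)
print(f"rule accuracy: {lo:.3f} .. {hi:.3f}")
```

The width of the interval shrinks as the sample grows, which is exactly the sense in which induction buys higher and higher likelihood rather than certainty.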