Today in 1836, Charles Babbage wrote in his notebook:
This day I had for the first time a general but very indistinct conception of the possibility of making an engine work out algebraic developments. I mean without any reference to the value of the letters. My notion is that cards (Jacquards) of the Calc. engine direct a series of operations and then recommence with the first so it might perhaps be possible to cause the same cards to punch others equivalent to any given number of repetitions. But their hole [their holes?] might perhaps be small pieces of formulae previously made by the first cards.
This passage, says Brian Randell in The Origins of Digital Computers, “puts beyond doubt the fact that Babbage had thought of using the Analytical Engine to do what would today be described as ‘computing its own programs.’”
In 1952, Arthur Samuel developed the first computer checkers-playing program and the first program to learn on its own; in 1959, he coined the term “machine learning,” defining it as “programming of a digital computer to behave in a way which, if done by human beings or animals, would be described as involving the process of learning.”
In 1953, reacting to what he called “the fuzzy sensationalism of the popular press regarding the ability of existing digital computers to think,” Samuel wrote: “The digital computer can and does relieve man of much of the burdensome detail of numerical calculations and of related logical operations, but perhaps it is more a matter of definition than fact as to whether this constitutes thinking.”