It is all about memory

Gill Eapen
2 min read · Jun 23, 2024


A recent article (1) argues that working memory may be central to learning and could point toward better AI designs. The human brain, operating under harsh energy constraints, has been optimizing its design for nearly a million years. For obvious reasons, it favored designs that minimize energy use. To do so, it had to minimize information transport distances and encourage local processing, which meant learning to hold more data in working memory; those adept at this were selected over time.

Another attribute may matter as well. The brain, despite its extremely efficient design, still consumes nearly 20% of the body's energy budget. It had to learn to make decisions with the least amount of data, unlike conventional AI designs, which use as much data as possible. Cheap computing power and memory have led AI researchers astray; chat and GPT aside, we are still far from any semblance of intelligence. More importantly, the path AI research is on, built on the assumption that the marginal cost of a unit of compute and memory is nearly zero, will always fail.

AI appears to be on a path to failure. Ironically, this is the result of a dozen companies with nearly infinite resources splurging their capital.

(1) Can AI learn like us? | ScienceDaily

Written by Gill Eapen

Gill Eapen is the founder and CEO of Decision Options®. He has over 30 years of experience in strategy, finance, engineering, and general management.
