You never really know which technologies will win in the marketplace.

Unless you have hindsight :-)

For a few decades, I have used Lisp's map and reduce functions (reduce and the mapcar family in Common Lisp and Scheme) and, more recently, their equivalents in Ruby (with the nicety of code blocks).
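As a small illustration of what that Ruby style looks like (a minimal sketch, not taken from any particular project of mine), here is map and reduce with code blocks:

```ruby
# Map each number to its square, then reduce (fold) the squares into a sum.
# This is the Ruby analogue of Lisp's mapcar and reduce.
numbers = [1, 2, 3, 4, 5]
squares = numbers.map { |n| n * n }               # [1, 4, 9, 16, 25]
total   = squares.reduce(0) { |sum, n| sum + n }  # 55
puts total
```

The code block syntax keeps the transformation logic right next to the collection operation, which is much of the appeal.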

Who would have predicted how important this pattern would become for scaling data crunching? Recently I have been getting (back) into Hadoop, a very high quality open source implementation of Google's file system and parallel map/reduce framework that lets you plug in your own map and reduce functions without worrying much about the scaling infrastructure. I have also been looking at CouchDB, which builds structured data views over partially structured data using map/reduce.
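The pattern Hadoop scales up can be sketched in a few lines of plain Ruby. This is just the classic word-count shape of map, group, and reduce, not the Hadoop API itself (which handles the distribution and fault tolerance for you):

```ruby
# Toy word-count in the map/reduce style (plain Ruby, not the Hadoop API).
lines = ["the quick brown fox", "the lazy dog"]

# Map step: each line emits a list of [word, 1] pairs.
pairs = lines.flat_map { |line| line.split.map { |word| [word, 1] } }

# Shuffle/sort step: group the pairs by their key (the word).
grouped = pairs.group_by { |word, _| word }

# Reduce step: sum the counts for each word.
counts = grouped.map { |word, ps| [word, ps.map { |_, c| c }.sum] }.to_h

puts counts  # e.g. {"the"=>2, "quick"=>1, ...}
```

What frameworks like Hadoop add is running the map step on many machines at once and moving the grouped pairs to the right reducers, which is exactly the "scaling infrastructure" you get to ignore.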

At the time (many years ago), I thought that Connection Machine style parallel data crunching would take over the world (*Lisp was very cool), but I was wrong about that: the Connection Machine relied on expensive proprietary hardware, and as often happens, technologies and markets don't develop until the price gets squeezed down.

Anyway, Alan Kay famously said that the best way to predict the future is to invent it, but for most of us there is always hindsight :-)
