Big data in a historical context


Excellent stuff from Alan Patrick on his Broadstuff blog, talking about the 70s, 80s and 90s versions of big data – or “data”, as they were calling it back then…

And you know what – you just cannot simulate the minute operation laden details of a shop floor or logistics network reliably. No matter how big your dataset, or your computers, or your machine tool onboard intelligence, there is just too much variability. Which is why the Just In Time/Lean movement came about as the better approach – the aim was to simplify the problem, rather than hit it with huge algorithm models and simulations so complex no one fully understood what they were doing anymore (just ask the banks what happens going down that route) – the aim of JiT/Lean was to actually reduce the problem variability, to get back to Small Data if you like.

Alan discusses the way that despite fascination with new technology and algorithms, the drumbeat that industry marches to is that of economics – in this case the pendulum swing of offshoring and onshoring, powered by the temporary advantage of emerging economies’ lower labour costs.

[….] It’s back to the future….I suspect they are now using bigger and bigger number crunching to eke the last 20% of improvements from the various kaizen projects ongoing, trying to keep the factories in situ as the Big Economics shift yet again

The rate of change today often feels bewildering at ground level, but by keeping one eye on the forces of history and economics, we can see ourselves in the context of slower-moving but more significant trends. In The Second Machine Age – which I’ve been fixated on over the last week (I even look dangerously close to finishing it) – the authors point out that

  • productivity gains from electric motors took about 30 years to emerge in manufacturing;
  • steam engines unlocked 100 years of productivity gains (and an exponential growth in human population);
  • microprocessors and the IT revolution unlocked meagre productivity gains until the late 1990s.

What drove productivity in these instances was innovation that used the technology better – innovation in products, processes, organisation and management. When we look at new technologies in our lives and workplaces – social computing, big data and so on – it could be decades before their full potential is felt by all bar the early adopters who are able to see that potential and change their mindsets and ways of working fastest.

Antony Mayfield
I'm Antony Mayfield - to find out more about me take a look at my LinkedIn profile (see the button on the home page). You can contact me by email at antony [dot] mayfield [at] gmail [dot] com.


  1. Your first link is missing an ‘L’.

    I always like seeing the word ‘big’ removed from “big data” ;) Have ordered The Second Machine Age, sounds very interesting.

    With “Kryder’s Law” (cost of storage halves every 18 months) and “Moore’s Law” (no. of transistors doubling every 2 years), yesterday’s “big” becomes tomorrow’s “small” quite quickly. The big problem with using “big data” software solutions is you are using technology that is more suited to Google, Facebook, NASA etc., when often all you need is a well optimised SQL database. There are some cases where “big data” software solutions are better; however, they often take longer to implement and are more costly for the same outcome.

    Developers will always want to use the latest and greatest technology (the best people will); however, they often need reminding of the commercial implications. SQL has been around since the 70s (40 years this year!), so the number of people around who understand and are skilled in SQL is huge compared to those in “big data”. SQL is also one of the few languages that is consistent across different providers (if you know MySQL, chances are you can work with SQL Server, PostgreSQL etc.).

    Often with technology there’s a lot of putting the cart before the horse. The best engineers, in whatever field they are in, take scientific knowledge and apply it to real-world problems. If the solution they come up with is more complicated than the problem, then they have failed.

    Technology should make our lives easier and better, otherwise, what’s the point?
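The commenter’s point that “often all you need is a well optimised SQL database” can be sketched in a few lines with Python’s built-in sqlite3 module. The table and column names here (orders, customer) are illustrative, not from the post; the point is that a single index turns a full-table scan into an indexed search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("customer_%d" % (i % 1000), float(i)) for i in range(100_000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer = 'customer_42'"

# Without an index, SQLite has to scan every row to answer the query.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()

# One line of "optimisation": index the column we filter on.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()

print(plan_before[-1])  # e.g. "SCAN orders" – a full-table scan
print(plan_after[-1])   # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

EXPLAIN QUERY PLAN is SQLite’s standard way to see how a query will execute; the same scan-versus-index-search distinction applies in MySQL, SQL Server, PostgreSQL and the rest, which is part of why the skills transfer so well.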

