Big data in a historical context


Excellent stuff from Alan Patrick on his Broadstuff blog, talking about the 70s, 80s and 90s versions of big data – or “data”, they were calling it back then…

And you know what – you just cannot simulate the minute, operation-laden details of a shop floor or logistics network reliably. No matter how big your dataset, or your computers, or your machine tool onboard intelligence, there is just too much variability. Which is why the Just-in-Time/Lean movement came about as the better approach – the aim was to simplify the problem, rather than hit it with huge algorithmic models and simulations so complex that no one fully understood what they were doing anymore (just ask the banks what happens going down that route). The aim of JIT/Lean was to actually reduce the problem's variability – to get back to Small Data, if you like.

Alan discusses how, despite the fascination with new technology and algorithms, the drumbeat that industry marches to is that of economics – in this case the pendulum swing of offshoring and onshoring, powered by the temporary advantage of emerging economies’ lower labour costs.

[….] It’s back to the future….I suspect they are now using bigger and bigger number crunching to eke the last 20% of improvements from the various kaizen projects ongoing, trying to keep the factories in situ as the Big Economics shift yet again

The rate of change today often feels bewildering at ground level, but by keeping one eye on the forces of history and economics, we can see ourselves in the context of slower-moving but more significant trends. In The Second Machine Age – which I’ve been fixated on over the last week (I even look dangerously close to finishing it) – the authors point out that

  • productivity gains from electric motors took about 30 years to emerge in manufacturing;
  • steam engines unlocked 100 years of productivity gains (and exponential growth in the human population);
  • microprocessors and the IT revolution yielded only meagre productivity gains until the late 1990s.

What drove productivity in these instances was innovation that used the technology better – innovation in products, processes, organisation and management. When we look at new technologies in our lives and workplaces, like social computing and big data, it could be decades before their full potential is felt by all bar the early adopters who are able to see it and change their mindsets and ways of working fastest.

The fog of revolution: social media trends 2006 & 2012


Thanks to the brevity and immediacy of Twitter, I have already tweeted to say that everyone needs to read the sources of inspiration for this post. So you’ll forgive me for opening with some tangent-ish thoughts…

Or maybe you won’t.

One of my favourite observations about change and the web is what I call “the fog of revolution”, a phrase that became very popular last year in a different context. When you’re in the middle of a revolution it is very hard to know what’s going on, not least when there are so many voices close by telling you exactly what is going on, and generally being very wrong.