"Learning is any change in a system that produces a more or less permanent change in its capacity for adapting to its environment"
Herbert Simon, The Sciences of the Artificial
Now more than ever, business tastes better if you put technology in it - tasks get done faster and more accurately, and employees can delegate the left-brain activities to their computing devices and focus on the right-brain activities that nurture their humanity. Who knows, they might even start to like their jobs.
Now let's shake off this rosy picture because, just like with any deep change, adding technology to a business model introduces its own set of challenges: security, vendor lock-in, task commoditization and so on.
But the biggest challenge is the added complexity layer.
As the world around us explodes into a myriad of perspectives and new voices that were, until the advent of ubiquitous connectivity, unknown to us, our quest for technological advancement exposes us more and more to this new, hyper-aware world.
Let me give you an easy example:
1/ you build a website for your company...
2/ then you decide to turn it into a portal that manages customer and vendor workflows (appointments, customer service, MDM)...
3/ then you want to integrate those workflows into your IT landscape...
4/ then you want to start proactively managing your external stakeholders...
5/ then you realize you need social media connectors...
6/ then you start analyzing social feeds in order to manage your social media risk...
7/ then you decide that your website should really be a web app...
8/ then you want that app to be managed in the cloud...
9/ then you want to outsource the app development lifecycle...
10/ then you want to mitigate the vendor and cloud risks...
And of course you need people to manage all this new complexity and they need to interact in new ways that need to be managed as well...
This brings me back to my early days (when artificial intelligence - AI - was little more than an exciting academic topic), when I was researching AI techniques for forecasting financial time series. Our main tool back then was a custom-built back-propagation neural network (NN), and we were struggling to size its layers optimally in order to give the NN the maximum predictive capability.
For those new to this, NNs are software programs that mimic the way the human brain learns (by altering the chemical signals in the synapses - the links between living brain cells): non-linear error-correction algorithms adjust the weights of the connections between software constructs that imitate brain cells.
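To make the mechanism concrete, here is a minimal sketch in Python with NumPy - a toy illustration, not the custom tool we used back then: a single hidden layer learning the XOR function, with the output error propagated backwards to adjust the connection weights.

```python
import numpy as np

# Toy back-propagation sketch: one hidden layer learning XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 4))   # input -> hidden connection weights
W2 = rng.normal(size=(4, 1))   # hidden -> output connection weights

lr = 0.5
for epoch in range(10000):
    # Forward pass through the "software neurons"
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Error correction: propagate the output error backwards
    err = out - y
    grad_out = err * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)

    # Adjust the connection weights (the artificial "synapses")
    W2 -= lr * h.T @ grad_out
    W1 -= lr * X.T @ grad_h

print(np.round(out, 2))  # typically approaches [[0], [1], [1], [0]]
```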
Nowadays, NNs have gone mainstream and sit at the core of most pattern-recognition software: OCR, photo and video recognition, voice recognition (Siri, Google Now, Cortana and the like).
The training procedure for a NN is that you train it on one dataset, then you test it on a second, similar (but different) dataset, then you validate it on a third, again similar (but different) dataset. The performance in the validation phase measures the NN's predictive ability - how well it handles data it has never seen.
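For readers who want to see what this looks like in today's tooling, here is a minimal sketch using scikit-learn's train_test_split and MLPRegressor on made-up data (the financial series we worked with are obviously not reproduced here); the three-way split mirrors the train / test / validate procedure above.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Hypothetical features X and target y standing in for the real time series
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=1000)

# First carve out the training set, then split the rest into test and validation
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_test, X_val, y_test, y_val = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
nn.fit(X_train, y_train)

print("test R^2: ", nn.score(X_test, y_test))  # used while tuning the model
print("valid R^2:", nn.score(X_val, y_val))    # the measure of predictive ability
```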
As we built the NN, we could witness what the literature called "the curse of complexity": as we added more layers, the NN just became dumber and dumber - the validation performance dropped abruptly. This seems counterintuitive - you'd expect that, as you add more layers with more neurons and more connections, the NN gets smarter.
It turns out that, when you have too many layers, the NN overlearns the training dataset - what we now call overfitting - to the point of memorizing the data itself rather than the underlying non-linear equation that generated that data.
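The effect is easy to reproduce on a small, noisy dataset. The sketch below (again scikit-learn on synthetic data, so treat the exact numbers as illustrative) sweeps the network's capacity and prints training versus validation fit - as capacity grows, the training score keeps climbing while the validation score typically stalls or drops.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Small, noisy dataset: easy for an oversized network to overlearn
rng = np.random.default_rng(2)
X = rng.normal(size=(120, 10))
y = np.sin(X[:, 0]) + 0.3 * rng.normal(size=120)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

# Sweep model capacity and compare fit on seen vs. unseen data
for layers in [(4,), (32, 32), (128, 128, 128)]:
    nn = MLPRegressor(hidden_layer_sizes=layers, max_iter=5000, random_state=0)
    nn.fit(X_train, y_train)
    print(layers,
          "train R^2:", round(nn.score(X_train, y_train), 2),
          "valid R^2:", round(nn.score(X_val, y_val), 2))
```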
To draw the analogy to business: when you have too much complexity embedded in your business, your organization learns very well how to manage the current set of activities, but loses the ability to anticipate new business opportunities and to closely follow the business strategy set forth by its leadership.
So, how does a business entity reconcile the unnatural need to grow (i.e. increase complexity) with the natural need to remain within the computational limits (or bounded rationality, as Herbert Simon calls it) of its decision-making layer (i.e. contain complexity)?
There are no easy answers to this question - large corporations went to great pains in the '80s and '90s to keep the business manageable by establishing strict and rigid business rules, by delegating decision-making authority, and by deeply embedding business logic into systems and procedures. This approach, however, brings its own complexity, leading ultimately to ossification, which is the enemy of adaptation and economic survival.
My personal belief is that the answer revolves around the enterprise's ability to upgrade and pivot its business model based on the changing environment.
Enterprise strategists need to develop an uncanny ability to always ask themselves the excruciating question:
"If I add this to my business, what else should I remove in order to keep it sane?"