A Change of strategy?
When is a change of strategy due, or overdue? In the last few days I had the luck of sharing a donut time with a brilliant colleague, and we ended up having quite a broad conversation about the present and the future.
Besides the facts, what’s changing? A fact itself, or even a few facts, might not set a trend and might not justify a change in strategy. Strategy is a deeper take on behaviour: a strategy is made to address one or many situations so that the outcome can be controlled.
As perfection isn’t a state but rather a direction, your strategy can’t be one either. It can still be viable even at low efficiency: if your investment strategy were right only 33% of the time, that could still be turned into a big win.
My point is that the breaking point of a strategy varies, and many strategies certainly deteriorate over time. This is true for something as trivial as remote working: a discipline pretty much underdeveloped and underused until COVID, whereas now most of us pretend we master it, but very few really do.
Education strategy is a big failure as well: the jobs I ended up doing didn’t have a corresponding academic path when I was young. At the current speed we might well be preparing our younglings for jobs that AI will do, if anybody has to do them at all.
Throughout history, when society has broken up with a strategy, it has sometimes led to turmoil and disorder (think equality); other times the break is subtle: what you needed to do as a businessman in the 60s to be taken seriously can be different from what you have to do at the present moment, especially if you work with modern tools. When a generation breaks up with a traditional strategy, that’s big.
My job as a developer is certainly a niche when debated within the perimeter of these bigger questions; however, I clearly see the impact of AI on software development.
A sizeable part of the investment in modern large-scale companies goes into working at scale, including reusability and collaboration. Really, the big question is: “how do we get from where we are to where we should be without crashing the system, with this large number of people working simultaneously?”
It’s an important question for humans in general, to be honest: for couples, for new joiners in a company, for car drivers.
If AI could do the work, would we care how it’s done? I’m not sure I would, as long as it complies with fundamental rules within which we can frame AI’s freedom.
As a metaphor, let’s say we want to get two folks, Giuseppe and Dejan, from one of my favourite fancy restaurants, Ivy Asia Mayfair, to LCY (it’s a posh airport, I know, but those are cool people), and we want to bring them there safely, well rested and on time, with all their luggage.
If AI were able to build and test a very large number of transportation systems, and to simulate quickly but with a very high level of accuracy whether the transport would fulfil all the directives given above… would we really care what the transport looks like, or whether it can carry three people instead of two?
I don’t think we would, because the moment we need to transport three people we can just ask AI to create a vehicle again.
Within the limits of what current AI offers, we are witnessing the sunset of the culture of the HOW [it’s done]; next will be the sunrise of the culture of the WHAT. It will matter very little how it’s done, as long as it delivers what was requested.
It’s a privilege to be here, witnessing this incredible moment in humankind.