I was at an event the other day and got talking to some young developers. They were interested to hear my views on the ‘new world of work’.
As strange as it sometimes feels, I’ve now been working in IT for two decades. So from time to time, I’m asked for advice by those at the beginning of their journey. There is always one thing I say.
That is: the industry is always changing, so stay agile. The specific technologies that you learn at university will rarely define your career, and almost certainly not beyond your first job. Technologies will come and go, so the key is to keep learning – and never stop.
Depending on how you count them, I learned a baker’s dozen of programming languages at university. All but a couple of these are now obsolete, and I’ve never really used any of them in a paid capacity.
Today’s new hotness will be lucky to survive long enough to become tomorrow’s embarrassing legacy tech.
Collaboration over conflict
As the conversation continued, I was shocked at how savvy the young engineers were about the structure of the organisations they might work in.
When I was fresh out of university, my colleagues and I tended to focus on the tech and ignore the “suits” until we were forced to pay attention to them. This attitude, combined with the then-commonplace borderline contempt for “lusers”, set up a lot of unnecessary conflict. Techies assumed that management decisions were explained by ignorance and malice in varying proportions.
The young developers I spoke with hadn’t so much as heard of the (dated and sexist) archetype of “beards vs. suits”.
A real shift in management
This more positive attitude really is a sign of the times. A recent survey by MongoDB found that younger developers are much more likely to think that IT decision-makers and developers are aligned when it comes to IT decisions.
Why? IT management as we know it has dramatically changed (and will continue to do so).
Managers today are required to be far more technically savvy than they were even a couple of decades ago, largely because the developer has become mission-critical.
We’re now in an era where software is everywhere. Every business has it at the core of its offerings and practices. Often, better software is the unique selling point or at least a major differentiator.
As such, the very nature of IT decision-making processes has shifted to involve much more collaboration between developers and management. The speed with which the landscape is changing has made disconnected decision-making almost impossible.
Beyond these technological shifts, huge organisational and process changes have also taken place. The old command-and-control style of management which made techies feel isolated from decisions made by suits has gone.
Today, if it takes eighteen months and dozens of people to see a concrete result from a project, something is inevitably wrong. When a couple of developers can whip up a minimum viable product, publish it for pennies, and start running A/B tests to gather feedback, any decision-making process that does not start from this reality needs a rethink.
After my chat with the young developers, I was pleased to see them stepping into this new world of work. I just hope they too get the backing of management and don’t get let down by empty “agile” promises.
Real change for real results
Only those organisations and individuals that embrace these new realities and stay agile will prosper in the future that is already all around us.
Cosmetic claims to be “flexible” and “collaborative” will be seen through instantly the first time someone actually tries to engage with the process.
That’s not to say that there should not be a process, just that the process has to be at least as agile as the organisation says it aims to be.
My young friends don’t have any wrong assumptions to unlearn. All they have to do is to remain open-minded, agile and willing to learn. There’s an exciting opportunity out there and I’m curious to see the world they will build.