Is the IT Department being left behind in the 21st Century?
I was sparked into writing this article this morning after hearing an expression I thought we had moved away from. I overheard a couple of senior IT people at my client talking about "end user computing".
This is an expression I haven't heard since the 90's or even the early 2000's, but with all the changes in devices and technology over the last 10-15 years, is it still a relevant concept?
This got me thinking about IT departments and the 21st Century. Has the march of technology overtaken the IT department's ability to react and deliver technology solutions? Has the advent of digital given the business the opportunity to cut the IT department out of the equation, and if it has, what does the IT department need to do to reclaim its "technological authority"?
In the 80's and 90's we implemented massive organisation- and productivity-changing systems that impacted entire organisations (in a lot of cases without email or mobile phones). Many of these systems still exist within organisations and are deemed "heritage" or "legacy". They needed a massive cohort to implement and then to support, and this formed the basis of the IT department.
Large frameworks for the "management" of these projects and systems were developed and governance structures put around them; ITIL and PRINCE2 are two of the most popular. This led to longer delivery times and an administrative burden.
In parallel to these changes, new technology was being introduced in the consumer space: smartphones, netbooks, tablets and smart TVs, to name but a few. The internet, which had ballooned during the late 90's and then crashed, suddenly became mainstream and, as I predicted in the late 90's, the fourth utility.
All the while, IT departments continued to get bigger and apply more and more frameworks and overheads, along with outsourcing what they saw as commodity skills which were, in fact, intellectual property (this will be the subject of a future article). What this meant was that even having an idea to change a small bit of technology cost you thousands before you started.
It also led to the rise of the “shadow IT department” where end user departments were hiring, developing and supporting their own solutions.
Then we entered the digital economy. The rise of smartphones, tablets, and ubiquitous internet via mobile phones and WiFi created a perfect nexus. Everything had to be digital. Again the IT department tried to impose a lot of the legacy disciplines onto the business, which were rejected. As a result a new part of the business was created and called "digital", in which technology-skilled staff worked alongside the business people as one to deliver what the business wanted.
Now we are deep into the new world. We have IT people looking after "heritage" and "legacy" systems and the digital teams looking after the social media and digital world. Where does this leave the traditional IT department? My answer is well behind the curve.
People existing in the old world need to upskill and understand where the world has moved on to. Certain ideas and understandings need to be challenged and refreshed.
In conclusion, there is no longer any such thing as "end user computing"; now everything is.
IP or commodity skill?
One of the trends that started at the end of the 20th Century and gathered pace in the 21st is the move to outsource certain parts of IT in the business to low-cost providers.
This seemed a great idea at first, as you could reduce, or fix, your costs for the more commodity skills. These included the provisioning of machines and desktop support for operating systems.
The same approach took hold in running data centres, where support for standard machines and operating systems was seen as an off-the-shelf skill.
As the pace of outsourcing gathered, more and more tasks were deemed to be commodity skills and candidates for outsourcing. Resources with many years' experience of the various tools and their usage in the organisation were let go and replaced through these types of deals.
This is where the problems started to arise. Many of the skills that were chosen appeared to be commodity skills, but management failed to realise that once you deviated from the standard usage of the product you were in fact moving into the realm of intellectual property.
Take, for instance, support for ERP systems like SAP or Oracle, amongst others. These are deemed to be fairly standard applications for which you can churn skills out of training camps in a standard manner and then get them supporting the business.
However, we all know this isn't true: it isn't what they are using that is important, it's how they are using it that really matters. Company A and Company B may both have inventory and financials, but the chances are they use the same package in totally different ways, and on top of that the chances are they have both modified the core system in different ways.
So the concept of being able to get resources at the turn of a tap becomes increasingly difficult, because those resources also need knowledge specific to the particular company they are working for.
One of the major trends in business today is the concept of nurturing your talent, and the recognition that the businesses that develop their talent will be the winners in the future. Decision makers therefore need to be very certain that anything they consider for outsourcing is a real "commodity skill", to avoid losing all their intellectual property.
As we move further toward the future and the true knowledge economy of high-skill, high-paid employment, the distinction between the two will be more important than ever.