Importance of Architecture in Business Strategy
Information Technology pervades nearly every aspect of business today. Yet despite its increasing use and proliferation, IT is growing ever more complex and is widely perceived as slowing things down. This is especially true of enterprise software which, ironically, keeps growing ever more pervasive. Business strategy does not live in isolation: talking about expediting business strategy execution while not attending to flexibility in the architecture is a lost cause.
The complexity of any structure, be it physical or software, relates directly to its architecture. The lay person sees the building’s façade, elevation, layout and so on; what gives the building its strength, tenacity and resilience is the underlying architecture. And just as it is hard for the lay person to appreciate the value of that hidden structure, architecture is the critical yet largely invisible element that holds a complex ecosystem together. This is even more true of software ecosystems.
We qualify architecture by its dexterity. Dexterity is a noun – it indicates nimbleness, adroitness and agility – and it is NOT a term that most IT portfolios are generally associated with.
There is a growing misconception that throwing more hardware at a problem will resolve it – faster processors, more memory, virtualization, coolers, network optimization software, and then optimization tools to optimize the optimization software, and so the story goes. Moore’s Law is turning out to be a disadvantage. I believe that constraints – of any type – are a good thing: they drive the need to be optimal. Nothing that is found in plenty is much respected – and this, in my opinion, spans the spectrum, from human talent to food to metals.
Not too long ago, the industry operated on memory measured in single-digit kilobytes, and 10MB of storage was considered a lot – and was very expensive. The Intel 8086 (1978) could address 1MB of physical memory through 64KB segments and was rated to run at 5–10MHz. Programs were written in assembly mnemonics, and memory had to be managed by hand. There was no slack. Code had to be terse and at its best. Processor cycles were expensive and hence had to be used in the most optimal manner.
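The point about constraints driving optimal code can be sketched with a small, hypothetical example (not from the original post; the function names and data are illustrative). Faster hardware shrinks the constant factor, but choosing a better algorithm changes the shape of the work itself – the kind of discipline that scarce cycles once forced on every programmer:

```python
def has_duplicates_quadratic(items):
    # O(n^2): compares every pair. Throwing faster hardware at this
    # only delays the point where it becomes unbearably slow.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicates_linear(items):
    # O(n): trades a little memory for vastly less work by tracking
    # what has already been seen in a set.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

On a list of a million items, the first version performs on the order of a trillion comparisons while the second performs a million set lookups – no amount of commodity compute closes that gap.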
In the days of the Victorian Internet – the telegraph – transmitting a character was expensive and restricted. Messages were kept short and terse, and early on the medium was reserved for only the most critical communications. In just a few decades, processors have become faster, memory far cheaper, and compute capacity available as a utility. The commoditization of compute has bred a lack of respect for it, and I feel this very commoditization is partly to blame for InfoTech’s current-day woes. Bloated code bases and data stores have yielded very little other than bloated IT portfolios – and because they don’t cost much (or at least so we think), this situation has continued to grow.
Cross-wired systems and a lack of respect for architectural integrity are another reason for this situation. In some ways, the notion of architecture arrived late to the systems space.
A large part of the solution to IT’s issues lies in architecture – specifically, in what I like to term ‘architectural dexterity’.
For IT portfolios to bear any trait of velocity, at a time when velocity is of the essence, maintaining dexterity is absolutely essential. How? That is a topic for another post.
Keep watching this space.