1. Business does not know itself.
Every organization struggles with data: either there is too much of it, or not enough.
In the case of too much data, the analysts and the IT teams struggle to make sense of the barrage. A lot of times, IT is forced to archive data, and in doing so removes it from the analysts' reach. In other cases, large amounts of data are simply stored, but no viable reporting is available, or only a small segment is ever reported on.
In the case of not enough data, local analysts are unable to get a complete picture because data is stored in multiple systems and is often localized to regions. This means that organizations struggle to see the whole. Imagine what that means for financial risk: What is my exposure to Bear Stearns? I don't know; it will take a few days to compile the report from various places. Don't believe me? This was the case at Lehman, and is the case at a Global Asiatic Bank. I would also bet it is the case at almost every other institution.
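To make the exposure question concrete, here is a minimal sketch of why it takes days. Every regional system stores positions in its own shape, so even the trivial sum requires mapping names and schemas first. All system names, fields, and figures below are hypothetical.

from decimal import Decimal

# Each region reports positions in its own schema (all hypothetical).
ny_positions = [{"cpty": "BEAR STEARNS", "mtm_usd": Decimal("1200000")}]
london_positions = [{"counterparty": "Bear Stearns & Co", "exposure": Decimal("-300000")}]
tokyo_positions = [{"CPTY_NAME": "BSC", "EXPOSURE_USD": Decimal("450000")}]

# Before you can even sum, you must map each system's naming to a common key.
ALIASES = {"BEAR STEARNS": "BSC", "Bear Stearns & Co": "BSC", "BSC": "BSC"}

def total_exposure(target):
    total = Decimal("0")
    for row in ny_positions:
        if ALIASES.get(row["cpty"]) == target:
            total += row["mtm_usd"]
    for row in london_positions:
        if ALIASES.get(row["counterparty"]) == target:
            total += row["exposure"]
    for row in tokyo_positions:
        if ALIASES.get(row["CPTY_NAME"]) == target:
            total += row["EXPOSURE_USD"]
    return total

print(total_exposure("BSC"))  # 1350000 -- and this is the trivial case

And this toy assumes three clean in-memory lists. In reality each source is a separate database behind a separate team, which is exactly why the honest answer is "a few days."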
2. Business doesn't trust IT to deliver.
The business simply does not trust IT to deliver the solutions it needs. When solutions are delivered, they become large monolithic entities, locking the organization in and becoming a constant expense. Any change to the system becomes an expense, at times larger than the cost of building the system in the first place. At other times, IT simply looks like a dog chasing its own tail. They move very fast and have short timelines, but the end products are not even remotely close to what the business wanted. IT seems to always have an excuse.

I was once in a three-hour meeting. The business analyst wanted account numbers. He wanted a system where he could debit one account and credit another. For that, he needed account numbers and amounts within those accounts. The developer was trying to explain that the analyst didn't need accounts; he only had to enter the transactions and the attributes of the money movement. The system could then aggregate those transactions in whatever way the analyst wanted. You see, the system was designed generically, and adding account numbers would intrude on that design. And so they went, back and forth, neither understanding the other. The end result: nothing. No account numbers. It was deemed that this feature wasn't go-live critical and could be addressed in the next phase.
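The tragedy is that they were describing the same thing. Here is a minimal sketch of the developer's generic design, with every name and amount hypothetical: transactions carry the attributes of the money movement, and the analyst's account balances fall out as an aggregation.

from collections import defaultdict

# Hypothetical transactions: each row is one money movement with attributes.
transactions = [
    {"debit": "ACC-100", "credit": "ACC-200", "amount": 500.0},
    {"debit": "ACC-200", "credit": "ACC-300", "amount": 200.0},
]

def balances(txns):
    """Derive per-account balances by aggregating transactions."""
    acct = defaultdict(float)
    for t in txns:
        acct[t["debit"]] -= t["amount"]   # money leaves the debited account
        acct[t["credit"]] += t["amount"]  # money enters the credited account
    return dict(acct)

print(balances(transactions))
# {'ACC-100': -500.0, 'ACC-200': 300.0, 'ACC-300': 200.0}

A view like this could have given the analyst his account numbers without intruding on the generic design. Instead, three hours produced nothing.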
The fault lies with neither IT nor the business. The problem is more basic. The business is very agile; it's able to move quickly and easily. If you've ever read a legal document or seen financial models, you'll understand. They can be very complex and very nuanced. Most technical architectures are not built to be nuanced. They are built to solve concrete Boolean problems. IT wants the problem defined in concrete terms, but the business is only able to articulate its current understanding of the world. Unfortunately, tomorrow there will be a new understanding. And so IT builds a monolith, because that's what it knows how to build, the business changes, and the organization suffers.
This really saddens me. Technology is capable of solving so many problems, but instead it's relegated to addressing the most basic of operational problems, and even there, we struggle. Consider all the technical possibilities: neural networks, expert systems, machine learning, data mining...
3. Business data is not clean.
So much data is simply bad. Imagine how complex some of these systems are, and then imagine the possibility of an error. An error that doesn't generate an exception: perhaps rounding, perhaps a logic error, perhaps an unforeseen condition. The output data becomes bad, and so it flows through the system. A lot of times, enterprise architectures don't even do basic reconciliation. Reconciliation requires active design, time, and thought. Most architectures are organically produced and are feature driven rather than thoughtfully designed. The end result is Frankenstein architectures and garbage data.
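Basic reconciliation doesn't have to be elaborate. Here is a minimal sketch of the idea, with hypothetical system names and trades: compare what two systems believe about the same records and flag the breaks for a person to investigate.

# Hypothetical books: trade id -> amount, as seen by two systems.
front_office = {"T1": 100.0, "T2": 250.0, "T3": 75.0}
back_office  = {"T1": 100.0, "T2": 255.0, "T4": 90.0}

def reconcile(a, b, tolerance=0.01):
    """Flag records missing from either side or differing beyond tolerance."""
    breaks = []
    for key in sorted(set(a) | set(b)):
        if key not in a:
            breaks.append((key, "missing in front office"))
        elif key not in b:
            breaks.append((key, "missing in back office"))
        elif abs(a[key] - b[key]) > tolerance:
            breaks.append((key, f"amount break: {a[key]} vs {b[key]}"))
    return breaks

for item in reconcile(front_office, back_office):
    print(item)
# ('T2', 'amount break: 250.0 vs 255.0')
# ('T3', 'missing in back office')
# ('T4', 'missing in front office')

Twenty lines, and yet most of the architectures I've seen never got around to it.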
4. There are a lot of manual processes.
How many times have you created a new system that actually creates more manual processes than it removes? New systems sometimes require users to enter data in multiple places, verify multiple places, etc. Some firms have massive operational groups, hundreds of people, whose sole job is to do what systems fail to do: enter data in multiple places, reconcile, data entry, data massaging, normalization, etc. This is a horrible model. It's extremely error prone, not to mention expensive.
5. Heavy reliance on people.
I believe that machines and people have their roles. Unfortunately, a lot of jobs done by people today should be done by machines. Machines and people need to find a harmony. Some things we do very well; other things, machines do better. Sifting through large amounts of data should be the role of machines. Alerting us to unforeseen circumstances should be the role of machines. Allowing us access to data should be the role of machines.
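As one toy illustration of that division of labor, assuming made-up data and a deliberately naive threshold: the machine scans every value and surfaces only the oddities, and a person judges what they mean.

from statistics import mean, stdev

def alerts(values, n_sigmas=2.0):
    """Flag values more than n_sigmas standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [(i, v) for i, v in enumerate(values) if abs(v - mu) > n_sigmas * sigma]

# Hypothetical daily settlement totals; day 6 is the anomaly.
daily_settlements = [100, 102, 98, 101, 99, 100, 740, 103, 97]
for day, value in alerts(daily_settlements):
    print(f"day {day}: settlement {value} looks unusual")
# day 6: settlement 740 looks unusual

A real system would use more robust statistics, but the point stands: the machine sifts, the person decides.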
6. Presentation
So few firms give thought to the presentation tier. In most cases, it's an afterthought. Some UIs are desktop, others web, others something else. The end user is required to memorize which features are available where, which report has what, etc. God forbid we standardize and unify the disparate systems. The user should have a single place to go, a single way to do something. The learning curve for using everything should either not exist or be extremely small.