Monday, March 31, 2014

You’ve Got Data-mail

Since the dawn of mankind, there has been a need to relay messages between people who cannot communicate directly. This has now evolved to the extent that, often, the two individuals who are effectively communicating do not even know who the other person is. Essentially, the communication has become a product of an indirect relationship based on social and economic norms.


But what has not changed is the basic building blocks of the communication. There is a medium (channel), a language (protocol), and various steps in disseminating the information from one end to the other. Now here is the interesting part: we have done this with messengers, smoke signals, postal mail and electronic mail. There is a common, fundamental set of principles here that has never changed, and in my humble opinion never will.


For this reason, I believe that by using a simple set of concepts and their relationships (a.k.a. a model), one can and should be able to describe any system of information exchange. This will simplify data management by trivializing the reference framework. My personal preference for such a model would be to reuse the concepts already used in one of the most classical forms of delivering information, specifically the postal service. You can then apply the framework to any system of information exchange by identifying how the teams, systems and processes map to the “information postal service”. This will further support the evolution of governance and data quality control practices.


I see this model as comprising three levels: the business contract, relating those who HAVE the information with those who NEED it at an agreement level; the information services layer, which underlies the steps in delivering the data (think mail delivery services); and finally the service management goals, comprising the parameters, or sensitivities, that need to be managed to ensure that the services operate efficiently and deliver the appropriate level of quality.


Other terms that come to mind include: “posting”, “packaging”, “gathering”, “sorting”, “distributing”, “delivering” and “collecting” the data. As I mentioned, what this means in your information exchange system will depend on how you design, configure and run the “system”.
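To make the metaphor a little more concrete, here is a minimal sketch of those postal stages as an explicit pipeline. Everything in it (the `Parcel` type, the stage names as plain strings, the no-op stage handling) is a hypothetical illustration of the framing, not a prescribed design:

```python
from dataclasses import dataclass, field

# The postal-service stages, in delivery order. In a real information
# exchange system, each of these would map to a team, system or process.
STAGES = ["posting", "packaging", "gathering", "sorting",
          "distributing", "delivering", "collecting"]

@dataclass
class Parcel:
    payload: str                                  # the information being exchanged
    history: list = field(default_factory=list)   # stages already applied

def run_pipeline(parcel: Parcel) -> Parcel:
    """Push a parcel of data through every stage in order."""
    for stage in STAGES:
        # Here each stage merely records itself; a real system would
        # transform, route or validate the payload at each step.
        parcel.history.append(stage)
    return parcel

p = run_pipeline(Parcel(payload="quarterly sales figures"))
```

The point of the sketch is that the stage list is the stable part; what each stage means is entirely a property of how you configure your particular “system”.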


So the next time you send or receive an e-mail, or post a message on someone’s social media channel, just think for a moment about how your HAVE (or NEED) of information relates to other forms, formats and volumes of information exchange.


End of transmission...

Saturday, March 15, 2014

Why Manage Vertical Data Lineage

Vertical data lineage refers to the alignment, appropriateness and visibility (in other words, the health of the connection) between the physical data models, the technical capability, and the business processes and objectives across the entire enterprise architecture stack (see TOGAF). While technology lives solely for the purpose of supporting the business, too often it is out of touch with what is really needed by the business.
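One way to picture vertical lineage is as explicit links running top-down through the stack. The sketch below is purely illustrative; the layer names follow the stack described above, while the example objective, process and model names are invented for the sake of the example:

```python
# Hypothetical vertical-lineage record: one chain of links from a business
# objective down to the physical data model that ultimately supports it.
lineage = {
    "grow customer retention": {             # business objective
        "process": "campaign management",    # business process
        "capability": "customer analytics",  # technical capability
        "data_model": "customer_dim_v2",     # physical data model
    },
}

def trace(objective: str) -> list:
    """Walk the stack top-down for one objective; a broken or missing
    link shows up as None, i.e. the 'connection' is unhealthy there."""
    link = lineage.get(objective, {})
    return [objective,
            link.get("process"),
            link.get("capability"),
            link.get("data_model")]
```

A gap anywhere in that chain is exactly the loss of alignment and visibility this post is about: the technology exists, but nobody can say which business objective it serves.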

There are several reasons for this ailment, including lack of strategic planning (resourcing), subjective decisions over technical capabilities (politics) and poor management (skills). These sound like great areas to work on in order to evolve your business, but let's review how those issues affect the optimal use of information in the business.

When you choose an inappropriate technology or method, particularly on the data management side of things, you increase your data-business distance. This means that you have weaker control over how your data supports your business. Not only does your business struggle to get the right information to the right people on time, but the technology group also struggles to fit square requirements into triangle-shaped technical solutions. This in turn increases what I call “data wrinkles” (workarounds that are cumbersome, unnecessary and expensive data flow solutions), which then lead to a natural increase in risk and operational costs. A classic example would be managers deciding to migrate data to a new platform based on a limited and/or subjective view of the solution’s capabilities and true ability to answer the business requirements (and in case you didn’t know: migrating to a newer platform does not constitute a business requirement). It is more likely, in these cases, that the related issues stem from problems in process design or gaps in data governance.

In order to ensure your technology is driven to support the information requirements of your business, you need to task someone with precisely this responsibility. This might sound like a trivial statement, but do you actually have someone in your business who has this objective on their performance contract? This role involves understanding the information needs of the business and ensuring that processes are designed using well-selected data models. The models need to be adopted by everyone in the business to minimize data entropy and data wrinkles, and the technology decisions need to be in line with the technology, business and data strategy.

If you do not have someone in your business acting as an Enterprise Data Architect - I would strongly recommend you get someone assigned to these duties.