Data Cleansing

The human brain is surprisingly good at mapping imperfect, slightly deviating, or inaccurate information onto familiar patterns and recognizing it. “Thomas Smith, living in ABC-Street” and “Tohmas Smth, living in ACB Street” will quickly be identified as most likely the same person with a few typing mistakes. Most computer programs, however, will treat these two records as two completely different persons living at two different addresses.
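What the brain does intuitively, software has to approximate with fuzzy string matching. As a minimal sketch (not part of any particular product), Python's standard library `difflib.SequenceMatcher` computes a similarity ratio between 0 and 1 for the two spellings from the example above:

```python
# Fuzzy comparison of two record spellings using only the standard library.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a similarity ratio between two strings (1.0 = identical)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

record_a = "Thomas Smith, ABC-Street"
record_b = "Tohmas Smth, ACB Street"

score = similarity(record_a, record_b)
print(f"similarity: {score:.2f}")  # well above a typical duplicate threshold of ~0.8
```

A program that flags pairs above such a threshold for review would catch this duplicate, whereas an exact-match lookup would not.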


The disintegration of the key account

In the hectic rush of daily business it happens quickly: a new order needs to be entered, and because of a small typing mistake in the search mask the existing customer master record cannot be found. A new one is quickly set up, albeit with a slightly different spelling.


This small act of negligence, repeated over time, has far-reaching consequences: the same customer is entered into the system several times, so the database shows many small customers instead of one key account.


“Garbage in, garbage out”

This old IT proverb says that the quality of data-processing output depends mainly on the quality of the input, i.e. to a large extent on the master data. The example of the disintegrated key account makes this apparent right away: if, on average, every customer is entered into the system just twice, the entire customer master data doubles in size. Search requests (e.g., the last order, last delivery, or last open item for a particular customer) become more difficult, more time-consuming, and more error-prone.


Business Intelligence or Business Stupidity?

The term “Business Intelligence” (BI) denotes the structured analysis of large amounts of business-relevant data with the help of statistical methods. The reports generated with BI are important tools for company management. Obviously, for the conclusions drawn from BI reports to be correct, it has to be ensured that the information in these reports is actually valid. And here our disintegrated key account comes into play again: polluted master data leads to polluted reports, very early in the process, very reliably, and very unpleasantly. Depending on the degree of pollution of the master data, business intelligence thus turns into business stupidity.


An individually configurable data cleansing framework from Systrion

Over the years, Systrion has built up extensive know-how in the area of master data cleansing. This know-how has been captured in a set of software programs that are available as a framework. The framework enables us to implement new solutions for our customers very quickly and to reach productive results in new projects.


With our programs, our customers find and remove existing duplicates in their master data and make sure that no new ones are created in the future. Data hygiene at the beginning of data processing ensures that the analyses at the end really generate business intelligence.
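The core idea behind duplicate removal can be illustrated with a short, hypothetical sketch (it does not reflect Systrion's actual framework): normalize each record to a matching key, then flag records that end up with the same key. The normalization here is deliberately crude, collapsing simple letter transpositions like “ACB” vs. “ABC”; the sample names and streets are invented.

```python
# Hypothetical doublet detection: group customer records by a normalized key.
from collections import defaultdict

def match_key(name: str, street: str) -> str:
    """Build a crude matching key: lowercase, strip punctuation, and sort
    the letters of each token so transpositions collapse to the same key."""
    text = (name + " " + street).lower().replace("-", " ").replace(",", " ")
    return " ".join("".join(sorted(token)) for token in text.split())

customers = [
    (1, "Thomas Smith", "ABC-Street"),
    (2, "Tohmas Smith", "ACB Street"),   # same customer, two typos
    (3, "Jane Doe", "Main Street"),
]

groups = defaultdict(list)
for customer_id, name, street in customers:
    groups[match_key(name, street)].append(customer_id)

doublets = [ids for ids in groups.values() if len(ids) > 1]
print(doublets)  # → [[1, 2]]
```

A production system would combine such keys with fuzzy scoring and a manual review step, since an overly aggressive key merges records that merely look alike.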
