I work in a data-modeling environment and am responsible for logically modeling (LM) data for mainframe and server databases. LM for the OLTP systems is done to third normal form (3NF). Recently, the DBA recommended LM for localized data in a much less rigid format, i.e., with less or potentially no normalization.
Are you aware of any discipline of LM for localized data? If so, can you recommend some reading on this topic?
Not only am I unaware of any resource that advocates logical modeling of data in an unnormalized fashion, but I would also advise against reading one if it existed. The logical data model should be normalized. Normalization is the process of identifying the one best place where each fact belongs.
Normalization is a design approach that minimizes data redundancy and optimizes data structures by systematically and properly placing data elements into the appropriate groupings. A normalized data model can be translated into a physical database that is organized correctly.
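To make "the one best place each fact belongs" concrete, here is a small sketch using Python's standard `sqlite3` module. The table and column names (`order_flat`, `customer`, `orders`, and so on) are invented for illustration; the point is only to show how repeated customer facts in an unnormalized structure collapse to a single row per customer once the data is properly grouped:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Unnormalized: customer name and city are repeated on every order row,
# so a change to a customer's city would have to touch many rows.
cur.execute("""CREATE TABLE order_flat (
    order_id INTEGER PRIMARY KEY,
    cust_id INTEGER, cust_name TEXT, cust_city TEXT,
    amount REAL)""")
cur.executemany("INSERT INTO order_flat VALUES (?,?,?,?,?)",
    [(1, 100, "Acme", "Chicago", 50.0),
     (2, 100, "Acme", "Chicago", 75.0),
     (3, 200, "Beta", "Dallas", 20.0)])

# Normalized: each customer fact is stored exactly once, in the
# entity it describes; orders keep only the foreign key.
cur.execute("""CREATE TABLE customer (
    cust_id INTEGER PRIMARY KEY, cust_name TEXT, cust_city TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    cust_id INTEGER REFERENCES customer(cust_id),
    amount REAL)""")
cur.execute("""INSERT INTO customer
    SELECT DISTINCT cust_id, cust_name, cust_city FROM order_flat""")
cur.execute("""INSERT INTO orders
    SELECT order_id, cust_id, amount FROM order_flat""")

# Three order rows, but the redundant customer facts collapse to two rows.
n_customers = cur.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
print(n_customers)  # 2
```

The same information is still fully recoverable with a join of `orders` to `customer`, which is exactly the guarantee a well-normalized grouping gives you.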
Normalization was created by E.F. Codd in the early 1970s. Like the relational model of data, normalization is based on the mathematical principles of set theory. Although normalization evolved from relational theory, the process of normalizing data is generally applicable to any type of data.
It is important to remember that normalization is a logical process and does not necessarily dictate physical database design. A normalized data model will ensure that each entity is well formed and that each attribute is assigned to the proper entity. Of course, the best situation is when a normalized logical data model can be physically implemented without major modifications. However, there are times when the physical database must differ from the logical data model due to physical implementation requirements and deficiencies in DBMS products.
Take the proper steps to ensure performance in the physical database implementation for the type of applications you will need to create, the service level agreements you will need to support, and the DBMS that you will use (I assume DB2 for z/OS, since this question was asked on the Search390.com portal). This may mean de-normalizing by combining tables or carrying redundant data (and so on), but this should be undertaken for performance reasons only. And the logical model should not have any of these "processing" artifacts in it.
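As a hypothetical sketch of the kind of performance-motivated redundancy described above (names invented; a real decision for DB2 would be driven by measured access paths and SLAs), carrying a copy of a customer fact on the order row trades away update simplicity to eliminate a join on the read path:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, cust_name TEXT);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    cust_id  INTEGER REFERENCES customer(cust_id),
    amount   REAL,
    -- de-normalized copy of a customer fact, carried only so that
    -- the hot read path can avoid a join to customer
    cust_name TEXT);
INSERT INTO customer VALUES (100, 'Acme');
INSERT INTO orders VALUES (1, 100, 50.0, 'Acme');
""")

# Read path: no join needed.
name = cur.execute(
    "SELECT cust_name FROM orders WHERE order_id = 1").fetchone()[0]

# Write path: the same fact must now be maintained in two places --
# precisely the redundancy the logical model should never carry.
cur.execute("UPDATE customer SET cust_name = 'Acme Corp' WHERE cust_id = 100")
cur.execute("UPDATE orders   SET cust_name = 'Acme Corp' WHERE cust_id = 100")
```

This is a physical-design artifact: it belongs (if anywhere) in the physical database, documented and justified by a performance requirement, not in the logical data model.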