Hi everyone,
In the last couple of days I had the opportunity to normalize a DB. :)
Let me show you a screenshot of a single table:
As you can see, a lot of data is repeated across the rows. I admit I had some bad thoughts about whoever let the DB end up like this in the first place. At first I didn't want to touch it, but after a while I let the challenge excite me, and in the end I did it.
Before I did anything, the total number of rows in the DB was above 100,000 and the size of the table was 10.5 MB.
My first step was to extract the component names into a separate table and reference them by ID.
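To make the idea concrete, here is a minimal sketch in Python with SQLite. The table and column names are made up (the real schema only appears in the screenshot): the repeated component names move into their own lookup table, and the big table keeps just a foreign key.

```python
# Sketch only: hypothetical schema, not the real one from the screenshot.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Roughly how the denormalized table looked: the component name repeated on every row.
cur.execute("""
    CREATE TABLE measurements_old (
        id             INTEGER PRIMARY KEY,
        component_name TEXT,      -- repeated over and over
        status         TEXT,      -- e.g. 'checked' / 'changed'
        measured_at    TEXT
    )
""")

# New lookup table: each component name is stored exactly once.
cur.execute("""
    CREATE TABLE components (
        id   INTEGER PRIMARY KEY,
        name TEXT UNIQUE NOT NULL
    )
""")

# The main table now references the component by ID instead of repeating the name.
cur.execute("""
    CREATE TABLE measurements (
        id           INTEGER PRIMARY KEY,
        component_id INTEGER NOT NULL REFERENCES components(id),
        status       TEXT,
        measured_at  TEXT
    )
""")

# Migration: fill the lookup table, then rewrite the rows to use the foreign key.
cur.execute("INSERT INTO components (name) "
            "SELECT DISTINCT component_name FROM measurements_old")
cur.execute("""
    INSERT INTO measurements (id, component_id, status, measured_at)
    SELECT m.id, c.id, m.status, m.measured_at
    FROM measurements_old m
    JOIN components c ON c.name = m.component_name
""")
conn.commit()
```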
This decreased the size of the main table to 8.5 MB, while the new lookup table took 1.5 MB. I saved 0.5 MB overall, which is not much, but the database became much cleaner.
My second step was to transform the check/change data, which was stored as strings, into numeric codes, which also makes it possible to internationalize the database. This brought the table size down to 6.2 MB, about 72% of its original size. On top of that, the data can now be internationalized, and we get all the benefits of a normalized DB schema: no update anomalies, and it is a joy to work with.
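Here is the same kind of sketch for the second step, again with hypothetical names and example languages: the textual status becomes a small integer code in the data table, and the human-readable, translatable labels live in a lookup table keyed by code and language.

```python
# Sketch only: hypothetical status codes and translations.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Translatable labels for each numeric status code.
cur.execute("""
    CREATE TABLE status_labels (
        code  INTEGER NOT NULL,   -- numeric value stored in the data table
        lang  TEXT    NOT NULL,   -- 'en', 'de', ...
        label TEXT    NOT NULL,   -- text shown to the user in that language
        PRIMARY KEY (code, lang)
    )
""")
cur.executemany(
    "INSERT INTO status_labels VALUES (?, ?, ?)",
    [(1, "en", "checked"), (2, "en", "changed"),
     (1, "de", "geprüft"), (2, "de", "gewechselt")],
)

# The data table stores only the small integer code instead of the string.
cur.execute("""
    CREATE TABLE measurements (
        id           INTEGER PRIMARY KEY,
        component_id INTEGER NOT NULL,
        status_code  INTEGER NOT NULL,
        measured_at  TEXT
    )
""")
cur.execute("INSERT INTO measurements (component_id, status_code, measured_at) "
            "VALUES (1, 1, '2011-05-01')")

# Picking the label in the user's language is a simple join.
cur.execute("""
    SELECT m.id, s.label
    FROM measurements m
    JOIN status_labels s ON s.code = m.status_code AND s.lang = ?
""", ("en",))
print(cur.fetchall())   # [(1, 'checked')]
```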
I also measured the speed of the join operations. Because of the normalization, the same data retrieval initially took about three times longer. Then I put an index on the foreign keys, and the speed was just like before. :)
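A rough sketch of the kind of check I mean, using the same hypothetical schema as above: run a join that has to find rows by their foreign key, add an index on that foreign key, and compare the timings.

```python
# Sketch only: hypothetical tables and synthetic data.
import random
import sqlite3
import time

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE components (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE measurements (id INTEGER PRIMARY KEY, "
            "component_id INTEGER, status_code INTEGER)")
cur.executemany("INSERT INTO components VALUES (?, ?)",
                [(i, f"component-{i}") for i in range(200)])
cur.executemany("INSERT INTO measurements (component_id, status_code) VALUES (?, ?)",
                [(random.randrange(200), random.randrange(2)) for _ in range(100_000)])

def timed_join():
    start = time.perf_counter()
    cur.execute("""
        SELECT m.id, m.status_code
        FROM components c
        JOIN measurements m ON m.component_id = c.id
        WHERE c.name = 'component-42'
    """).fetchall()
    return time.perf_counter() - start

slow = timed_join()   # without an index, the join scans all the measurement rows
cur.execute("CREATE INDEX idx_meas_component ON measurements (component_id)")
fast = timed_join()   # with the index, the rows are found by a direct lookup
print(f"without index: {slow:.4f}s  with index: {fast:.4f}s")
```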