16 October 2023
by Tran Manh Hung – Webscope.io

Database Normalization: A Simple Tale of Ignoring 1NF

Tags: database, sql
Let's dive into a quick chat about database normalization. If you've heard of it, awesome. If not, don't worry. This little story may shed light on why it's important.

Database Normalization in a Nutshell

Picture this: you're trying to organize a big box of mixed-up Lego bricks. You'd probably sort them by color, size, or type, right? Database normalization is like that but for data! It's a way to sort and store your data neatly to avoid mix-ups and make sure everything is easy to find. For those who fancy a textbook definition, Wikipedia tells us:
Database normalization is organizing a relational database to reduce data redundancy and improve data integrity.
In simpler terms: it's a set of rules (or "normal forms") that helps keep your data organized and your database running smoothly. The trade-off: the more normalized your data, the more joins you may need to pull specific pieces back together.

Bypassing 1NF: A Shortcut Gone Wrong

This article tells a real story from one of my past projects, where we tried to be clever and decided to ignore the first normal form (1NF) to "simplify" our database.
Just to clarify, 1NF says:
Every field should contain a single value, no sets of values or nested records allowed.
In other words: "Don't put tables inside of tables." But, of course, we did just that—tucked a table into another and found ourselves in a mess of complications and headaches.

The story

Our journey commenced with the seemingly straightforward task of crafting a mechanism to store varied types of content – encompassing charts, custom HTML content, and checklists. The initial thought process nudged us toward the creation of a single table, structured as follows:
create table [dbo].[ContentSelection]
(
    [ContentPartSelectionID] int identity (1, 1) not null,
    [ConfigurationJSON] nvarchar(max) null,
    [ProfileTabID] int not null,
    [ContentType] int not null
);
A singular majestic table, ContentSelection, was meant to graciously host various content types - charts, HTML blobs, or checklists, all neatly serialized and tucked into the ConfigurationJSON column. Oh, and a polite ContentType column to nudge us about what was packed inside each JSON, making deserialization easy thanks to .NET wizards like System.Text.Json and Newtonsoft.Json.
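To make that concrete, here is a minimal sketch of what the reading path looked like in spirit; the enum values and the ChecklistConfiguration model are illustrative assumptions, not the project's actual code:

using Newtonsoft.Json;

// Sketch: ContentType tells us which model the JSON should deserialize into.
object ReadConfiguration(ContentSelection row) => row.ContentType switch
{
    1 => JsonConvert.DeserializeObject<ChartConfiguration>(row.ConfigurationJSON),
    2 => JsonConvert.DeserializeObject<CustomContentConfiguration>(row.ConfigurationJSON),
    3 => JsonConvert.DeserializeObject<ChecklistConfiguration>(row.ConfigurationJSON),
    _ => throw new InvalidOperationException($"Unknown content type {row.ContentType}")
};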

Problem 1: Navigating the Puzzle of Configuration Updates

Oh, the joys of updating configurations! Especially when dealing with JSON, it turned out to be a notable quest in our database management journey. Even seemingly simple tasks, like tweaking a chart's Title attribute, uncovered some pretty significant hurdles embedded in this method.

Simple Title Update in Code: A Walk in the Park

In the world of coding, this task seemed like a walk in the park. The steps of deserialization, executing the update, and saving the JSON again could be done in a few simple steps:
var data = JsonConvert.DeserializeObject<ChartConfiguration>(chart.ConfigurationJSON);
data.Title = "New Title";
chart.ConfigurationJSON = JsonConvert.SerializeObject(data);
// And the chart is saved!
This snippet of C# code allows developers to change the Title without diving deep into the entire JSON string. However, as the application grew, the code ended up all over the place: this deserialization and serialization was repeated in many spots. That's a code smell; we should have done it in one place. Sometimes we just took the whole string from the client and saved it as-is, trusting (hoping) that the client was sending the configuration in a correct state. But that's a different story.
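In hindsight, a single helper could have contained that smell. A sketch of what the one-place round-trip might have looked like (our illustration, not the code we actually had):

using Newtonsoft.Json;

// Sketch: deserialize once, let the caller mutate, serialize back once.
void UpdateConfiguration<T>(ContentSelection row, Action<T> mutate)
{
    var config = JsonConvert.DeserializeObject<T>(row.ConfigurationJSON)
                 ?? throw new InvalidOperationException("Configuration JSON is empty or invalid.");
    mutate(config);
    row.ConfigurationJSON = JsonConvert.SerializeObject(config);
}

// Usage: the Title tweak from above becomes a one-liner.
// UpdateConfiguration<ChartConfiguration>(chart, c => c.Title = "New Title");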

Simple but Not So Simple: The Database Update

Trying to update the configuration JSON directly within the database manually gave us a few options, each with its own set of complex puzzles:
  1. Whole JSON String Update:
UPDATE [dbo].[ContentSelection]
SET [ConfigurationJSON] = '{"Title": "New Title", "Type": "Chart", ...}'
WHERE [ContentPartSelectionID] = 1
Surprisingly, updating the entire JSON string, while seemingly straightforward, demands a meticulous understanding of the existing data structure to prevent any accidental data adventures (or misadventures).
  2. Making Friends with SQL Server’s JSON_MODIFY Function:
UPDATE [dbo].[ContentSelection]
SET [ConfigurationJSON] = JSON_MODIFY([ConfigurationJSON], '$.Title', 'New Title')
WHERE [ContentPartSelectionID] = 1
The JSON_MODIFY function seemed like a neat and tidy update approach. Yet, it introduced a new set of complexities, particularly for those developers who might not be best buddies with SQL JSON manipulation functions.

HTML Property Challenges: An Additional Twist

And then, things got particularly interesting when dealing with properties that encapsulate entire HTML strings. Direct updates weren't quite in the cards due to the necessity of escaping quotes to keep our JSON string in good shape. Thus, what seems like a straightforward update morphs into a careful dance of ensuring every character is correctly escaped and validated, amplifying the possibilities for error and inefficiency in database management.
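For what it's worth, passing the value in as a variable lets JSON_MODIFY handle the escaping; hand-building the JSON string would mean escaping manually, for example with STRING_ESCAPE (SQL Server 2016+). A sketch, where the $.Html path is our assumption:

-- JSON_MODIFY escapes the quotes inside the value for us, keeping the stored JSON valid.
DECLARE @html nvarchar(max) = N'<a href="https://example.com">Say "hi"</a>';

UPDATE [dbo].[ContentSelection]
SET [ConfigurationJSON] = JSON_MODIFY([ConfigurationJSON], '$.Html', @html)
WHERE [ContentPartSelectionID] = 1;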

Problem 2: Leaping Over Initial Data Insertion Hurdles

Inaugurating a new production instance entailed inserting a wide variety of content—charts, checklists, and the like—into the ContentSelection table. What seemed like a straightforward task was, in reality, a hidden obstacle course.

The Balance Between Quantity and Quality

Inserting many rows into the ContentSelection table meant wrestling with large strings of JSON configurations. Each piece of content, whether a chart or checklist, demanded a unique, error-free JSON string to guarantee smooth sailing in functionality and display.
Example JSON:
{
  "Title": "Sample Chart",
  "Type": "Chart",
  "Configuration": {
    "Type": "Bar",
    "Data": {
      "Labels": ["January", "February", "March"],
      "Datasets": [{"Label": "Monthly Sales", "Data": [30, 25, 45]}]
    }
  }
}
Even the tiniest misstep, like a missing comma or a mislabeled key, could put at risk:
  • Data integrity: mischievous JSON disrupting data retrieval or manipulation.
  • User experience: misconfigurations causing a little chaos in content display or UI interactions.
  • Developer efficiency: time drained on debugging and correcting JSON strings.
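One guard we could have added, an assumption on our part rather than something the original schema had, is letting SQL Server reject malformed JSON at the door with a CHECK constraint on ISJSON (available since SQL Server 2016):

-- Sketch: reject malformed JSON at insert/update time.
ALTER TABLE [dbo].[ContentSelection]
ADD CONSTRAINT [CK_ContentSelection_ValidJson]
    CHECK ([ConfigurationJSON] IS NULL OR ISJSON([ConfigurationJSON]) = 1);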

The Sneaky "What-If" of Mistakes

While seemingly trivial, an error in the JSON string can secretly trigger a chain reaction of operational disruptions, impacting developers and end-users alike. Though subtle, this issue beautifully underscores the crucial nature of precise data management and hints at the complex challenges that emerge when diverting from tried-and-true database normalization practices.

Problem 3: Data Retrieval

While seemingly simple, extracting data from the ContentSelection table unveiled its complexity in deserializing JSON strings. This complexity, particularly when handling larger objects like comprehensive dashboards requiring multiple content elements, swells even more in cloud environments like Azure, where every computational second counts—both in performance and pennies.

Problem 3.5: Search Operations Are Not So Light on Serialized Columns

Embarking on search operations within serialized columns is not just a stroll in the tech park. Digging through JSON strings demands far more computational power than a columnar search against a properly normalized setup, and as those JSON objects swell in size and complexity, the search becomes an increasingly obstinate beast, voraciously consuming resources and casually nudging our end-users’ response times into the slow lane.
And nvarchar(max), oh what a whimsical challenge for standard SQL Server search: a type that ordinary indexes can't even use as a key column, so every lookup means scanning the lot. Thus the serialized columns sashay through, daring us to keep pace without tripping over the nuances of adept database management. May the odds be ever in our favor!
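To make the contrast concrete, here is a sketch of the same lookup in both worlds. The ChartConfiguration table comes from the solution below; indexing Title would also mean shrinking it from nvarchar(max), another assumption:

-- Searching inside the serialized column: JSON_VALUE runs per row,
-- so the whole table is scanned and no ordinary index can help.
SELECT [ContentPartSelectionID]
FROM [dbo].[ContentSelection]
WHERE JSON_VALUE([ConfigurationJSON], '$.Title') = N'Monthly Sales';

-- The normalized equivalent: a plain predicate that a regular index could serve.
SELECT [ContentPartSelectionID]
FROM [dbo].[ChartConfiguration]
WHERE [Title] = N'Monthly Sales';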

Problem 4: The Need for a Schema Update

We had multiple model classes within the source code with this comment.
// Careful with renaming any of the properties here!!! - will break existing content parts
That's it. That explains it all: rename a property, and every JSON blob already sitting in production quietly stops deserializing.
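Had we stayed with JSON, one partial mitigation would have been pinning the serialized names, so renaming a C# property stops being a silent schema migration. A sketch using Newtonsoft's [JsonProperty] attribute; the model shape is illustrative:

using Newtonsoft.Json;

// Sketch: the attribute pins the JSON key, so the C# property can be renamed
// without breaking content parts already stored in the database.
public class ChartConfiguration
{
    [JsonProperty("Title")]
    public string Title { get; set; }

    [JsonProperty("ChartType")]
    public int ChartType { get; set; }
}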

The Solution: A Nod to 1NF

To alleviate the problems mentioned above, we pivoted towards adhering to the First Normal Form (1NF). It entailed restructuring our database, primarily by creating distinct tables for each content type, thus eliminating the need for JSON serialization within the database.
-- Original table, omitting the JSON column
create table [dbo].[ContentSelection]
(
    [ContentPartSelectionID] int identity (1, 1) not null,
    [ProfileTabID] int not null,
    [ContentType] int not null
);

-- Additional tables
create table [dbo].[ChartConfiguration]
(
    [ContentPartSelectionID] int not null,
    [Title] nvarchar(max) null,
    [ChartType] int not null, -- e.g., 1 = Bar, 2 = Line, 3 = Pie
    [Values] nvarchar(max) not null,
    [XName]  nvarchar(100) not null,
    [YName]  nvarchar(100) not null
);

create table [dbo].[CustomContentConfiguration]
(
    [CustomContentConfigurationID] int identity (1, 1) not null,
    [ContentPartSelectionID] int not null,
    [Title] nvarchar(max) null,
    [Html] nvarchar(max) null
);
... -- you get the idea
While effectively eliminating the need for extensive JSON operations within SQL, this solution required a meticulous migration process, ensuring existing customer data was transferred accurately and securely to the new structure.

Navigating Through Data Migration

The transition demanded the crafting of migration scripts for each existing customer instance.
insert into ChartConfiguration (ContentPartSelectionID, Title, [Values], ChartType, XName, YName)
select
    ContentPartSelectionID,
    Json_Value(ConfigurationJSON, '$.title'),
    Json_Value(ConfigurationJSON, '$.values'),
    Json_Value(ConfigurationJSON, '$.ChartType'),
    Json_Value(ConfigurationJSON, '$.xName'),
    Json_Value(ConfigurationJSON, '$.yName')
from ContentSelection
where ContentType = 1
Interestingly, this migration was multi-faceted and far from straightforward. Each content type required its own script, and each script had to include checks to ensure data integrity, particularly validating JSON strings and handling instances where the Json_Value function misbehaved for no apparent reason.
To safeguard against potential mishaps during the migration, the process was executed in two strategic phases:
  1. Data Migration: Transfer data without altering the original structure.
  2. Schema Update: Only after verifying the migrated data did we modify the schema, like deleting the JSON column.
This dual-phase approach provided a safety net, ensuring that if complications arose at any point, a revert to the previous database state was always within reach. While resource-intensive, this strategy underscored the importance of meticulous planning and safeguarding data during migration, especially when restructuring to adhere to normalization principles like 1NF.
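Between the two phases, simple reconciliation queries can provide that safety check before the schema is touched. A sketch in that spirit (our illustration; the real checks described above also validated the JSON itself):

-- After phase 1, row counts per content type should line up
-- between the source rows and the new normalized table.
SELECT
    (SELECT COUNT(*) FROM [dbo].[ContentSelection] WHERE [ContentType] = 1) AS [SourceCharts],
    (SELECT COUNT(*) FROM [dbo].[ChartConfiguration]) AS [MigratedCharts];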

Conclusion

This story concludes with a happy ending. We successfully migrated all our customers to the new structure and can now update the configuration of each content type without any problems. However, this story is not about the happy ending. It's about the issues we faced and how we solved them.
Take this story as a reminder to follow best practices as much as possible, even when you think you can do better.

One last note

Yes, I called them problems, not "challenges" or "opportunities". Because they were problems.
