JSON_MODIFY vs multiple rows (performance)

Wondering whether JSON_MODIFY is more performant than storing and retrieving the data as multiple rows.

My search turned up some Stack Overflow answers, but they only compared storing multiple rows with updating a single large JSON document inside one row, and in those comparisons multiple rows came out as the favored approach.

But no one mentioned JSON_MODIFY.

So, to clarify: is it more performant to use JSON_MODIFY to add to the current JSON, or to create a new JSON document and add it as a new row to the database?
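
To make that concrete, here's a rough sketch of the two options I mean (SQL Server syntax; the Users / Messages tables and column names are just placeholders I made up):

```sql
-- Option A: append each new message to one big JSON document per user
UPDATE Users
SET MessagesJson = JSON_MODIFY(MessagesJson, 'append $.messages',
                               JSON_QUERY(N'{"from": 42, "text": "hi"}'))
WHERE UserId = @UserId;

-- Option B: insert one new row per message into a separate table
INSERT INTO Messages (UserId, MessageJson)
VALUES (@UserId, N'{"from": 42, "text": "hi"}');
```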

Thanks for the tips!

I am not quite sure I understand, because what you're trying to compare is really an operation on the data versus a way of accessing the data.

Most of the time, anything implemented inside the database is more efficient than reading the data, modifying it, and writing it back across a network connection. The built-in functions are also typically well optimized and written in fast languages like C.

I am not sure what you mean by adding a new row to the database. If you are updating JSON, you typically want to modify a value that is already in place in the database. On average, in-place updates are faster than recreating everything from scratch, especially if you are only updating a single field.
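
For example, an in-place update like the following runs entirely inside the database in a single statement, instead of pulling the JSON over the network, changing it in the application, and writing the whole document back (table and column names here are just placeholders):

```sql
-- The read-modify-write happens server-side; only the statement crosses the network
UPDATE Users
SET ProfileJson = JSON_MODIFY(ProfileJson, '$.lastSeen',
                              CONVERT(nvarchar(30), SYSUTCDATETIME(), 126))
WHERE UserId = @UserId;
```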

1 Like

So basically, let’s say you have 10,000 users and each user might receive 10,000 messages. Is it better to update one main JSON document per user with JSON_MODIFY, or to create a separate row for each message for each user? I.e. in the JSON_MODIFY case, each user has their own JSON.

Everyone suggests multiple rows, but I haven’t seen anyone mention using JSON_MODIFY to alter the main JSON, or what the performance implications of that would be.

When you use one row per message, you should not use JSON at all. You should store the data split out, with one attribute per database column. That way you get much faster search options if you ever need them.
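
As a rough sketch of what I mean (all names made up):

```sql
-- One row per message and one column per attribute, instead of one JSON blob per user
CREATE TABLE Messages (
    MessageId BIGINT IDENTITY PRIMARY KEY,
    UserId    INT           NOT NULL,
    SentAt    DATETIME2     NOT NULL,
    Sender    INT           NOT NULL,
    Body      NVARCHAR(MAX) NOT NULL
);

-- Searches then use ordinary indexes on ordinary columns
CREATE INDEX IX_Messages_UserId_SentAt ON Messages (UserId, SentAt);
```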

Anyway, modern databases are all heavily optimized to work with JSON, so they can create indexes and even do full-text search on JSON data. Extracting the JSON and storing it in separate columns is only worthwhile for performance if you have huge data and massive usage of it (and I don’t mean a few megabytes and a few hundred requests a second).
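
In SQL Server, for instance, you can index a JSON property through a computed column (a sketch, assuming a hypothetical MessagesJson column on a Users table):

```sql
-- Expose one JSON property as a computed column, then index it like any other column
ALTER TABLE Users
    ADD LastMessageDate AS JSON_VALUE(MessagesJson, '$.lastMessageDate');

CREATE INDEX IX_Users_LastMessageDate ON Users (LastMessageDate);
```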

The biggest disadvantage I see in storing JSON in a database is maintaining the data. While I can easily search the database with a DB management tool like SQL Developer, Navicat, phpMyAdmin, etc. when the data is stored in columns, it’s pretty hard to find anything if all I have is JSON columns with very long JSON in them.
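
For comparison, the same search written against columns and against a per-user JSON document (again with made-up names):

```sql
-- Trivial when the attributes are proper columns
SELECT * FROM Messages
WHERE UserId = 42 AND Body LIKE '%invoice%';

-- With one JSON document per user you have to unpack it first, e.g. with OPENJSON
SELECT j.Sender, j.Body
FROM Users u
CROSS APPLY OPENJSON(u.MessagesJson, '$.messages')
    WITH (Sender INT '$.from', Body NVARCHAR(MAX) '$.text') AS j
WHERE u.UserId = 42 AND j.Body LIKE '%invoice%';
```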

1 Like

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.