SQL Server added native JSON support back in SQL Server 2016, and SQL Server 2025 officially introduces a dedicated JSON data type. If you’re on an older version, you’ve probably been storing JSON in NVARCHAR(MAX) columns and parsing it with functions like JSON_VALUE() and OPENJSON(). That still works, but the new JSON type gives you validation, better storage, and cleaner semantics.
And along with the JSON type, we get some new JSON-related features. Let’s take a quick walk through them.
What the JSON Data Type Actually Does
When you declare a column as JSON, SQL Server automatically validates that whatever you insert is well-formed JSON. If it’s not, the insert fails. No more silently storing malformed strings and discovering the problem downstream.
Under the hood, SQL Server stores the JSON in an optimized binary format (UTF-8), which is more compact than storing raw text in NVARCHAR(MAX) and faster to parse. When you read it back out, you get valid JSON text.
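One side effect worth noting: because the value is stored in parsed form rather than as raw text, insignificant formatting from your original string (extra whitespace, indentation) isn’t necessarily preserved on the way back out. A quick illustration:

-- The value round-trips as equivalent JSON, but the exact
-- whitespace you typed may not survive the binary round trip
SELECT CAST('{  "a"  :  1,  "b"  :  [ 1, 2 ]  }' AS JSON) AS doc;

If your application depends on byte-for-byte fidelity of the original text, keep that in mind; for everything else, equivalent JSON is all you need.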
Example
Say you’re building a SaaS platform and each customer account has a set of feature entitlements. These could be things like which modules they’ve unlocked, their usage limits, and any custom overrides. This data is highly variable per account, changes over time, and doesn’t fit neatly into normalized columns. It’s a good fit for JSON.
Let’s create a table with a JSON column:
CREATE TABLE account_entitlements (
account_id INT NOT NULL PRIMARY KEY,
account_name NVARCHAR(255) NOT NULL,
plan_tier NVARCHAR(50) NOT NULL,
entitlements JSON NOT NULL,
updated_at DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
We defined the entitlements column as JSON.
Now let’s insert a couple of accounts:
INSERT INTO account_entitlements (account_id, account_name, plan_tier, entitlements)
VALUES
(
1001,
'Meridian Analytics',
'enterprise',
'{
"modules": ["reporting", "data_export", "api_access", "sso"],
"limits": {
"seats": 500,
"api_calls_per_day": 100000,
"data_retention_days": 730
},
"custom_overrides": {
"white_label": true,
"dedicated_support": true
}
}'
),
(
1002,
'Foxridge Consulting',
'growth',
'{
"modules": ["reporting", "data_export"],
"limits": {
"seats": 25,
"api_calls_per_day": 5000,
"data_retention_days": 90
},
"custom_overrides": {}
}'
);
Output:
(2 rows affected)
And importantly, if we try to insert invalid JSON, SQL Server rejects it immediately:
INSERT INTO account_entitlements (account_id, account_name, plan_tier, entitlements)
VALUES (1003, 'Bad Corp', 'starter', '{ modules: ["reporting"] }');
Output:
Msg 13609, Level 16, State 9, Line 1
JSON text is not properly formatted. Unexpected character 'm' is found at position 2.
Querying JSON Columns
To query JSON columns, you can use the same JSON functions you probably already know, including JSON_VALUE() for scalar values, JSON_QUERY() for objects or arrays, and OPENJSON() to shred an array into rows.
You can also use any of the new JSON features that were introduced alongside the JSON type in SQL Server 2025, such as the JSON_CONTAINS() function.
You can use JSON_VALUE() to get specific scalar values:
SELECT
account_name,
JSON_VALUE(entitlements, '$.limits.seats') AS seat_count,
JSON_VALUE(entitlements, '$.limits.api_calls_per_day') AS daily_api_limit,
JSON_VALUE(entitlements, '$.limits.data_retention_days') AS data_retention_days
FROM account_entitlements;
Result:
account_name seat_count daily_api_limit data_retention_days
------------------- ---------- --------------- -------------------
Meridian Analytics 500 100000 730
Foxridge Consulting 25 5000 90
Let’s find all accounts that have the white-label override enabled:
SELECT account_name, plan_tier
FROM account_entitlements
WHERE JSON_VALUE(entitlements, '$.custom_overrides.white_label') = 'true';
Result:
account_name plan_tier
------------------ ----------
Meridian Analytics enterprise
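JSON_VALUE() only returns scalars. To pull out a whole object or array as a fragment, use JSON_QUERY():

SELECT
account_name,
JSON_QUERY(entitlements, '$.limits') AS limits_json
FROM account_entitlements;

This returns the entire limits object for each account as JSON text, which is handy when you want to pass a sub-document along to application code rather than pick out individual values.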
We can shred the modules array so we get one row per module per account:
SELECT
a.account_name,
m.value AS module_name
FROM account_entitlements a
CROSS APPLY OPENJSON(JSON_QUERY(entitlements, '$.modules')) m;
Result:
account_name module_name
------------------- -----------
Meridian Analytics reporting
Meridian Analytics data_export
Meridian Analytics api_access
Meridian Analytics sso
Foxridge Consulting reporting
Foxridge Consulting data_export
One platform caveat worth knowing: OPENJSON() doesn’t accept the native JSON type on all platforms yet. In SQL Server 2025 on-premises it works fine, but on Azure SQL Database, Managed Instance, and Fabric, you’ll need to explicitly cast the column to NVARCHAR(MAX) first, otherwise you’ll hit an implicit conversion error:
-- Use this pattern when targeting Azure SQL DB / Managed Instance / Fabric
SELECT
a.account_name,
m.value AS module_name
FROM account_entitlements a
CROSS APPLY OPENJSON(CAST(entitlements AS NVARCHAR(MAX)), '$.modules') m;
On SQL Server 2025 on-prem, the original form without the cast works as-is.
That being said, it sounds like a temporary issue, and things could have changed by the time you read this.
Updating JSON Values
There are two ways to update JSON columns in SQL Server 2025, and which one you should use depends on whether your column is the native JSON type.
The modify Method (preferred for JSON columns)
When you’re working with a column declared as JSON, the .modify() method is what Microsoft recommends. The reason is that it can perform in-place modifications directly on the stored binary representation. This means it doesn’t have to deserialize the whole document, apply the change, and write it back. For strings, in-place modification is possible when the new value is the same length or shorter. For numbers, it’s possible when the new value fits within the same type and range.
The syntax uses dot notation on the column name directly in the SET clause.
Update a numeric limit in-place:
UPDATE account_entitlements
SET entitlements.modify('$.limits.seats', 50)
WHERE account_id = 1002;
Update a string value (here the first module name; the new value is no longer than the old one, so it can be patched in place):
UPDATE account_entitlements
SET entitlements.modify('$.modules[0]', 'billing')
WHERE account_id = 1002;
Note that the modify method is currently in preview and only available in SQL Server 2025 on-premises. It’s not yet on Azure SQL Database or Managed Instance, though that will likely change in due course.
JSON_MODIFY() (works everywhere)
JSON_MODIFY() is the function you’ve probably used before, and it still works fine on JSON columns. The difference is that it takes the full JSON document as input, applies the change, and returns a new string. The column gets fully rewritten rather than patched. For small documents this doesn’t matter much, but for large blobs updated frequently it’s less efficient than the modify method.
You can update a specific key with JSON_MODIFY() like this:
UPDATE account_entitlements
SET entitlements = JSON_MODIFY(entitlements, '$.limits.seats', 50)
WHERE account_id = 1002;
Here’s one that adds a new module to the array for an account:
UPDATE account_entitlements
SET entitlements = JSON_MODIFY(
entitlements,
'append $.modules',
'webhooks'
)
WHERE account_id = 1002;
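JSON_MODIFY() can also remove a key: in the default lax mode, setting a path’s value to NULL deletes that key from the document.

-- Remove the white_label override from the enterprise account
UPDATE account_entitlements
SET entitlements = JSON_MODIFY(entitlements, '$.custom_overrides.white_label', NULL)
WHERE account_id = 1001;

If you actually want to store a JSON null at the path rather than delete the key, use strict mode ('strict $.custom_overrides.white_label') instead.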
Checking for Values with JSON_CONTAINS()
SQL Server 2025 also introduced JSON_CONTAINS(), which lets you check whether a specific value exists at a given path in a JSON document. It returns 1 if found, 0 if not, and NULL if the path doesn’t exist or any argument is NULL.
The syntax goes like this:
JSON_CONTAINS(target_json, search_value, path_expression)
One thing to know upfront is that the path argument is required, and if the path points to an array, you need to use the [*] wildcard. There’s no “search the whole document” shortcut yet.
Going back to our entitlements table, a common operation would be checking whether an account has a specific module enabled before doing something with it. Instead of shredding the array with OPENJSON() and filtering, you can use JSON_CONTAINS() directly in a WHERE clause:
-- Find all accounts that have the 'api_access' module enabled
SELECT account_name, plan_tier
FROM account_entitlements
WHERE JSON_CONTAINS(entitlements, 'api_access', '$.modules[*]') = 1;
Result:
account_name plan_tier
------------------ ----------
Meridian Analytics enterprise
You can also use it to check a scalar value at a specific path:
-- Find accounts where dedicated support is turned on
SELECT account_name
FROM account_entitlements
WHERE JSON_CONTAINS(entitlements, CAST(1 AS BIT), '$.custom_overrides.dedicated_support') = 1;
Result:
account_name
------------------
Meridian Analytics
Notice the CAST(1 AS BIT). SQL Server matches the type of the search value against the JSON value, so if you’re checking a boolean in JSON, you need to pass a BIT. This is different from JSON_VALUE(), which always returns a character string and forces you to compare against 'true' or 'false' as text.
One upside of JSON_CONTAINS() over the OPENJSON()/filter approach is readability and intent. When you’re filtering rows based on whether a value exists in a JSON array, JSON_CONTAINS() makes the query self-explanatory at a glance. The OPENJSON() pattern works fine, but it can take a few seconds to parse mentally.
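For comparison, here’s one way the same module check looks with the OPENJSON() pattern (using the explicit cast so it also runs on Azure SQL Database):

SELECT account_name, plan_tier
FROM account_entitlements a
WHERE EXISTS (
    SELECT 1
    FROM OPENJSON(CAST(a.entitlements AS NVARCHAR(MAX)), '$.modules') m
    WHERE m.value = 'api_access'
);

Both return the same rows; the JSON_CONTAINS() version just states the intent more directly.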
A Gotcha Worth Knowing
Drivers may not see JSON as JSON.
If you use CAST(something AS JSON) in a query, SQL Server returns a json type, but the sp_describe_first_result_set system procedure, which most data access clients and drivers rely on to understand result column types, doesn’t correctly report it as json. What drivers actually see depends on the TDS protocol version in use: TDS 7.4 and above sees varchar(max) with UTF-8 collation, and older TDS versions see nvarchar(max). This is purely a metadata issue. The data itself is fine, but it means your application code can’t assume the column type will come back labelled as JSON. Keep that in mind if you’re doing any type-based dispatch in your data layer.
For columns stored directly as JSON in a table (not cast at query time), most modern drivers handle this gracefully, but it’s worth verifying behavior for your specific stack.
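If you want to see what your drivers will see, you can ask sp_describe_first_result_set directly. The reported type name depends on your server version and the TDS version in play:

-- Inspect the result-set metadata for a cast-to-JSON column
EXEC sp_describe_first_result_set
    @tsql = N'SELECT CAST(''{}'' AS JSON) AS doc;';

Check the system_type_name column in the output against what your data access layer expects.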
When to Use the JSON Type (and When Not To)
JSON columns make sense when your data is genuinely variable in structure. This could include metadata, feature flags, config blobs, event payloads, etc. They’re also useful when you’re integrating with external APIs that hand you JSON and you want to store it without mapping every field to a column.
Where JSON columns fall short: anything you need to filter or join on frequently. SQL Server can’t index into JSON properties the way it can index a normal column (without some extra setup using computed columns). If you’re regularly querying WHERE JSON_VALUE(col, '$.user_id') = @id, you’re better off promoting that field to a real column.
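If you do need to filter on a JSON property regularly, the usual workaround is to promote it to a persisted computed column and index that. A sketch against our entitlements table (casting to NVARCHAR(MAX) so JSON_VALUE() behaves consistently across platforms; verify that PERSISTED is accepted for this expression on your version):

ALTER TABLE account_entitlements
ADD seat_count AS CAST(JSON_VALUE(CAST(entitlements AS NVARCHAR(MAX)), '$.limits.seats') AS INT) PERSISTED;

CREATE INDEX ix_account_entitlements_seat_count
ON account_entitlements (seat_count);

Queries can then filter on seat_count like any other indexed column, with no JSON parsing at query time.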
The sweet spot is usually a hybrid. Keep your relational columns for the things you query against, and use JSON for the flexible, supplementary data that travels with a record but doesn’t drive queries.
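As a concrete (hypothetical) example of that hybrid shape:

CREATE TABLE events (
    event_id BIGINT IDENTITY PRIMARY KEY,
    user_id INT NOT NULL,              -- promoted: filtered and joined constantly
    event_type NVARCHAR(50) NOT NULL,  -- promoted: used in WHERE clauses
    occurred_at DATETIME2 NOT NULL,
    payload JSON NULL                  -- variable, supplementary event data
);

The fields you query live as real, indexable columns; everything else rides along in payload.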