JSON Recordsets are Your Friends
Regardless of what database you use, you can probably harness JSON recordsets to improve your queries.
Many relational databases now support JSON columns. JSON columns within a relational database provide a bridge between the structured world of SQL and the flexibility of NoSQL, letting you leverage the strengths of both approaches within a single system. The JSON functions in many popular databases also offer a way to convert a JSON document into a table record set, which opens up all kinds of possibilities: for example, JOINing against a JSON record set in a CTE is far more efficient than building a huge IN list. This article shows how to convert a JSON dataset into a table in the most common databases.
MySQL and SAP HANA happen to create JSON record sets the same way, using JSON_TABLE:
SELECT *
FROM JSON_TABLE(
'[{"firstName":"Paulo","lastName":"Coelho"},{"firstName":"Markolo","lastName":"Hondachi"}]',
'$[*]'
COLUMNS (
first_name nvarchar(200) PATH '$.firstName',
last_name nvarchar(200) PATH '$.lastName'
)
) jt
Oracle is very close to MySQL and SAP HANA, but notice the difference in the column definitions: Oracle requires varchar2 instead of nvarchar.
SELECT *
FROM JSON_TABLE(
'[{"firstName":"Paulo","lastName":"Coelho"},{"firstName":"Markolo","lastName":"Hondachi"}]',
'$[*]'
COLUMNS (
first_name varchar2(200) PATH '$.firstName',
last_name varchar2(200) PATH '$.lastName'
)
) jt
PostgreSQL has a dedicated function, JSON_TO_RECORDSET, to handle this:
SELECT *
FROM JSON_TO_RECORDSET(
'[{"firstname":"Paulo","lastname":"Coelho"},{"firstname":"Markolo","lastname":"Hondachi"}]'
) AS jt(
firstname varchar(200),
lastname varchar(200)
)
Microsoft SQL Server also has its own function, OPENJSON:
SELECT *
FROM OPENJSON(
'[{"firstname":"Paulo","lastname":"Coelho"},{"firstname":"Markolo","lastname":"Hondachi"}]'
)
WITH (
firstname nvarchar(200) '$.firstname',
lastname nvarchar(200) '$.lastname'
)
SQLite uses JSON_EACH to expand the array into rows and JSON_EXTRACT to pull out each field:
SELECT
JSON_EXTRACT(value,'$.firstName') as firstname,
JSON_EXTRACT(value,'$.lastName') as lastname
FROM JSON_EACH(
'[{"firstName":"Paulo","lastName":"Coelho"},{"firstName":"Markolo","lastName":"Hondachi"}]'
)
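Because SQLite ships inside most language runtimes, the query above is easy to drive from application code, binding the JSON string as an ordinary parameter of a prepared statement. A minimal sketch using Python's built-in sqlite3 module (the table-free query and sample names mirror the example above):

```python
import json
import sqlite3

# Sample data assembled in application code, serialized once.
authors = [
    {"firstName": "Paulo", "lastName": "Coelho"},
    {"firstName": "Markolo", "lastName": "Hondachi"},
]

conn = sqlite3.connect(":memory:")

# The JSON string is bound as a normal query parameter -- no string
# concatenation, so there is no SQL-injection risk.
rows = conn.execute(
    """
    SELECT
        JSON_EXTRACT(value, '$.firstName') AS firstname,
        JSON_EXTRACT(value, '$.lastName')  AS lastname
    FROM JSON_EACH(?)
    """,
    (json.dumps(authors),),
).fetchall()

print(rows)  # [('Paulo', 'Coelho'), ('Markolo', 'Hondachi')]
```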
Last but not least is Snowflake, which combines PARSE_JSON with LATERAL FLATTEN:
WITH BASE AS (
SELECT PARSE_JSON(
'[{"firstName":"Paulo","lastName":"Coelho"},{"firstName":"Markolo","lastName":"Hondachi"}]'
) data
)
SELECT
value:"firstName"::varchar(50) AS firstName,
value:"lastName"::varchar(50) AS lastName
FROM BASE, LATERAL FLATTEN(input=>data)
Extracting data from JSON record sets looks different in every database system, but it is powerful everywhere. Try using them in prepared statements, passing the JSON dataset in as a bound variable. Try using them in a CTE (Common Table Expression) to get rid of large IN statements in your queries. What other ways can you think of to use JSON record sets?
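To make the CTE suggestion concrete, here is a hypothetical sketch (again using Python's sqlite3; the books table and names are invented for illustration) that replaces a long IN (...) list with a JOIN against a JSON record set:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, author_last TEXT)")
conn.executemany(
    "INSERT INTO books VALUES (?, ?)",
    [
        ("The Alchemist", "Coelho"),
        ("Brida", "Coelho"),
        ("Some Other Book", "Smith"),
    ],
)

# Instead of ... WHERE author_last IN ('Coelho', 'Hondachi', ...),
# pass the whole list as one JSON parameter and JOIN against it.
wanted = ["Coelho", "Hondachi"]
rows = conn.execute(
    """
    WITH wanted(last_name) AS (
        SELECT value FROM JSON_EACH(?)
    )
    SELECT b.title
    FROM books b
    JOIN wanted w ON w.last_name = b.author_last
    ORDER BY b.title
    """,
    (json.dumps(wanted),),
).fetchall()

print([r[0] for r in rows])  # ['Brida', 'The Alchemist']
```

The query text stays the same no matter how many names are in the list, so the database can reuse the prepared statement instead of re-parsing a different IN list each time.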