Understanding DuckDB’s ARG_MIN_NULL() Function

DuckDB has an arg_min_null() function that works in a similar way to the arg_min() function. That is, it finds the row with the minimum value in one column and returns the value from another column in that row.

There are some differences between the two functions, though. The main one is the way they handle NULL values. Also, arg_min_null() accepts only two arguments, whereas arg_min() accepts an optional third argument. Additionally, there aren’t any aliases for arg_min_null() at the time of writing (arg_min() has a couple of aliases).

In this article we’ll look at how arg_min_null() works, and we’ll compare it with arg_min() to see how each function handles NULL values.
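As a quick preview, here’s a minimal sketch using a hypothetical products table. The comments assume the documented behaviour, where arg_min() skips NULL values of its first argument and arg_min_null() doesn’t.

CREATE TABLE products (name VARCHAR, price DECIMAL(8,2));
INSERT INTO products VALUES ('Widget', 9.99), (NULL, 4.50), ('Gadget', 12.00);

SELECT
    arg_min(name, price)      AS with_arg_min,       -- skips the NULL name, so returns 'Widget'
    arg_min_null(name, price) AS with_arg_min_null   -- returns the name at the lowest price, even if it's NULL
FROM products;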

Continue reading

LIST_REVERSE() vs LIST_REVERSE_SORT() in DuckDB: What’s the Difference?

DuckDB has a good range of functions for dealing with lists and arrays. Amongst these are list_reverse() and list_reverse_sort(). Looking at their names, you could be forgiven for thinking that they do the same thing. But they don’t.

If you’re wondering why DuckDB has a list_reverse() and a list_reverse_sort() function, read on.
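In short, one reverses the current order of the elements, while the other sorts them in descending order. A minimal sketch:

SELECT
    list_reverse([3, 1, 2])      AS reversed,        -- [2, 1, 3]: elements in reverse order
    list_reverse_sort([3, 1, 2]) AS reverse_sorted;  -- [3, 2, 1]: elements sorted in descending order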

Continue reading

Convert Column Values to a JSON Array with DuckDB’s JSON_GROUP_ARRAY() Function

The json_group_array() function in DuckDB is used to aggregate values into a JSON array, making it especially useful when working with structured or semi-structured data such as JSON. This function is part of DuckDB’s JSON extension and works similarly to DuckDB’s string_agg() or the group_concat() function in some other RDBMSs, but instead of returning a delimited string, it returns a well-formed JSON array.

This function is particularly helpful when we need to represent grouped or hierarchical data in a JSON format for export, reporting, or further transformation.
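Here’s a minimal sketch using a hypothetical orders table:

CREATE TABLE orders (customer_id INTEGER, product VARCHAR);
INSERT INTO orders VALUES (1, 'Book'), (1, 'Pen'), (2, 'Lamp');

-- Aggregate each customer's products into a JSON array
SELECT customer_id, json_group_array(product) AS products
FROM orders
GROUP BY customer_id;
-- e.g. ["Book","Pen"] for customer 1 and ["Lamp"] for customer 2 (element order may vary)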

Continue reading

3 Ways to Get the Weighted Average in DuckDB

Weighted averages are common calculations in data analysis, allowing us to assign different levels of importance to individual values in our dataset. Unlike simple averages, where each value has equal impact, weighted averages let us incorporate the relative significance of each observation. This is particularly valuable for scenarios like calculating GPA (where courses have different credit weights), investment portfolio returns (where assets have varying allocations), or quality ratings (where reviewers have different expertise levels).

In this article, we’ll explore three ways of calculating weighted averages in DuckDB.
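As a taste of the simplest approach (not necessarily all three covered in the article), here is the classic SUM(value * weight) / SUM(weight) formula, shown with a hypothetical grades table:

CREATE TABLE grades (course VARCHAR, grade DOUBLE, credits INTEGER);
INSERT INTO grades VALUES ('Math', 3.7, 4), ('History', 3.3, 3), ('Lab', 4.0, 1);

-- Each grade is weighted by its credit hours
SELECT SUM(grade * credits) / SUM(credits) AS weighted_gpa
FROM grades;
-- (3.7*4 + 3.3*3 + 4.0*1) / (4 + 3 + 1) = 3.5875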

Continue reading

Extract All Values From a JSON Document With DuckDB’s JSON_TRANSFORM() Function

The json_transform() function in DuckDB is a handy tool for converting JSON strings into structured data types like STRUCT, MAP, and LIST. This allows you to directly query and manipulate nested JSON data using standard SQL, making it much easier to work with complex JSON objects and arrays.

Think of it as a way to cast your JSON data into a more usable, typed format within your database.
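A minimal sketch of the idea (the JSON values here are purely illustrative):

SELECT json_transform(
    '{"name": "Fred", "scores": [7, 9, 10]}',
    '{"name": "VARCHAR", "scores": ["INTEGER"]}'
) AS result;
-- Returns a STRUCT with a VARCHAR name field and an INTEGER[] scores field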

Continue reading

Get the Number of Elements in a JSON Array with JSON_ARRAY_LENGTH() in DuckDB

In DuckDB, the json_array_length() function is used to determine the number of elements in a JSON array. This function can return the count of items in the top-level array or in a nested array within it. It’s especially useful when working with complex JSON structures where arrays are embedded inside objects or other arrays. By combining it with JSON path expressions, you can target specific portions of the data for more precise analysis.
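Here’s a minimal sketch showing both the top-level form and the path-based form:

SELECT
    json_array_length('[1, 2, 3, 4]')                        AS top_level,  -- 4
    json_array_length('{"items": [10, 20, 30]}', '$.items')  AS nested;     -- 3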

Continue reading

Create a TIMESTAMP Value in DuckDB with MAKE_TIMESTAMP()

The make_timestamp() function in DuckDB is a handy tool for creating timestamp values from individual components. It allows us to construct timestamp values using two alternative approaches: by specifying the number of microseconds since the epoch, or by specifying the year, month, day, hour, minute, and second components separately.

Let’s take a look at the make_timestamp() function, along with some examples.
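As a quick preview, here’s a minimal sketch of both forms (assuming the single-argument form is interpreted as microseconds since the epoch):

SELECT
    make_timestamp(1704067200000000)         AS from_micros,      -- 2024-01-01 00:00:00
    make_timestamp(2024, 1, 1, 12, 30, 45.5) AS from_components;  -- 2024-01-01 12:30:45.5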

Continue reading