
SQL / MySQL: It's not in the manual, but I wrote a generate_series-style function

Posted 2019-12-08

Preface

It's not in the manual (undocumented), but I wrote a function that, given a start number and an end number, returns the consecutive integers between them.
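For context, this mimics PostgreSQL's built-in generate_series, a set-returning function that produces such a sequence directly (MySQL has no built-in equivalent):

```sql
-- PostgreSQL, for comparison only:
select * from generate_series(1, 5);
-- returns the rows 1, 2, 3, 4, 5
```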

Environment

$mysql --version
mysql  Ver 8.0.18 for Linux on x86_64 (MySQL Community Server - GPL)

The script

Leaning on the power of JSON. Specifying json as the return type is not mentioned in the manual.

drop function if exists nats;
create function nats(start_rn bigint, end_rn bigint)
returns json deterministic
return(
  -- Recursive CTE: count n up from start_rn to end_rn.
  with recursive rec(n) as(
    select start_rn
    union all
    select n + 1 from rec where n < end_rn
  )select
    -- Collect the per-row objects into a single JSON array.
    json_arrayagg(s1.jele)
  from(
       -- json_objectagg as a window function with a constant key:
       -- each row ends up as {"key": n}, ordered by n.
       select json_objectagg("key", s1.n) over(order by s1.n) as jele from rec s1
  )s1
);

Checking that it works

mysql> select * from json_table(nats(1,5),"$[*]" columns(ele bigint path "$.key")) as s1;
+------+
| ele  |
+------+
|    1 |
|    2 |
|    3 |
|    4 |
|    5 |
+------+
5 rows in set (0.01 sec)
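Once unpacked with json_table, the sequence can feed ordinary SQL. A hypothetical usage sketch (the start date is made up for illustration) that derives a week of consecutive dates:

```sql
-- Hypothetical usage: map sequence numbers 1..7 onto consecutive dates.
select date_add('2019-12-01', interval s1.ele - 1 day) as d
from json_table(nats(1, 7), "$[*]"
       columns(ele bigint path "$.key")) as s1;
-- 2019-12-01 through 2019-12-07, one row each
```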

The default recursion depth limit trips at 1001 rows.

mysql> select max(ele) from json_table(nats(1,1001),"$[*]" columns(ele bigint path "$.key")) as s1;
ERROR 3636 (HY000): Recursive query aborted after 1001 iterations. Try increasing @@cte_max_recursion_depth to a larger value.

Raising the limit

set session cte_max_recursion_depth = 2000;

Now it comes back.

mysql> set session cte_max_recursion_depth = 2000;
Query OK, 0 rows affected (0.00 sec)

mysql> select max(ele) from json_table(nats(1,1001),"$[*]" columns(ele bigint path "$.key")) as s1;
+----------+
| max(ele) |
+----------+
|     1001 |
+----------+
1 row in set (0.01 sec)

mysql> select max(ele) from json_table(nats(1,2000),"$[*]" columns(ele bigint path "$.key")) as s1;
+----------+
| max(ele) |
+----------+
|     2000 |
+----------+
1 row in set (0.01 sec)

mysql> select max(ele) from json_table(nats(1,2001),"$[*]" columns(ele bigint path "$.key")) as s1;
ERROR 3636 (HY000): Recursive query aborted after 2001 iterations. Try increasing @@cte_max_recursion_depth to a larger value.

One million rows came back in about 4 seconds.

mysql> set session cte_max_recursion_depth = 1000000;
Query OK, 0 rows affected (0.00 sec)

mysql> select max(ele) from json_table(nats(1,1000000),"$[*]" columns(ele bigint path "$.key")) as s1;
+----------+
| max(ele) |
+----------+
|  1000000 |
+----------+
1 row in set (4.21 sec)

Ten million rows, though...

mysql> set session cte_max_recursion_depth = 10000000;
Query OK, 0 rows affected (0.00 sec)

mysql> select max(ele) from json_table(nats(1,10000000),"$[*]" columns(ele bigint path "$.key")) as s1;
ERROR 1301 (HY000): Result of json_binary::serialize() was larger than max_allowed_packet (67108864) - truncated

I fiddled with the setting.

I went from around 64 MB up to around 1 GB. Still no luck. (One likely reason: SET GLOBAL max_allowed_packet only applies to new connections, so the existing session kept the old 64 MB limit, as the unchanged 67108864 in the error below suggests.)

ERROR 1301 (HY000): Result of json_binary::serialize() was larger than max_allowed_packet (67108864) - truncated
mysql> set session max_allowed_packet = 128MB;
ERROR 1232 (42000): Incorrect argument type to variable 'max_allowed_packet'
mysql> set session max_allowed_packet = 1280000000;
ERROR 1621 (HY000): SESSION variable 'max_allowed_packet' is read-only. Use SET GLOBAL to assign the value
mysql> set global max_allowed_packet = 1280000000;
Query OK, 0 rows affected, 1 warning (0.00 sec)
mysql> select max(ele) from json_table(nats(1,10000000),"$[*]" columns(ele bigint path "$.key")) as s2;
ERROR 1301 (HY000): Result of json_binary::serialize() was larger than max_allowed_packet (67108864) - truncated

For counts above a million, you could probably just repeat batches of a million.
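A hypothetical sketch of that batching idea: union several calls to nats with shifted ranges, so no single call exceeds the configured recursion depth or the JSON packet limit (the range boundaries here are illustrative):

```sql
-- Hypothetical batching sketch: stitch two ranges together.
select s.ele
from json_table(nats(1, 1000000), "$[*]"
       columns(ele bigint path "$.key")) as s
union all
select s.ele
from json_table(nats(1000001, 2000000), "$[*]"
       columns(ele bigint path "$.key")) as s;
```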

Afterword

Again, this is undocumented behavior.

It was good practice for writing stored functions.

That's all. Thank you for reading.

Addendum, 2019-12-08

I went looking for other approaches and found one. It looks easier to use and has more features.
