一尘不染

What is the simplest way to find the biggest objects in Redis?

redis

I have a 20GB+ RDB dump in production. I suspect a specific set of keys is bloating it, and I'd like a way to spot the 100 biggest objects, either from static dump analysis or by asking the server itself, which holds about 7M objects.

Dump analysis tools like rdbtools are no help with this (in my opinion) very common use case!

I was thinking of writing a script that uses redis-cli debug object to iterate over the entire keyset, but I have the feeling I must be missing some tool (see the sketch below).
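As a rough illustration of that script idea, here is a minimal shell sketch that walks the keyspace with redis-cli --scan and reads the serializedlength field from DEBUG OBJECT for each key, keeping the 100 largest. The host and port are placeholder assumptions, and running this over 7M keys will put noticeable load on the server.

#!/bin/sh
# Iterate all keys via SCAN and record DEBUG OBJECT's serializedlength for each,
# then print the 100 largest. Host/port are placeholders for the real instance.
# Note: serializedlength is the RDB-serialized size, not live memory usage.
redis-cli -h 127.0.0.1 -p 6379 --scan |
while read -r key; do
  size=$(redis-cli -h 127.0.0.1 -p 6379 DEBUG OBJECT "$key" |
         grep -o 'serializedlength:[0-9]*' | cut -d: -f2)
  printf '%s\t%s\n' "$size" "$key"
done | sort -rn | head -n 100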


2020-06-20

1 Answer

一尘不染

An option has been added to redis-cli for this: redis-cli --bigkeys

Example output, based on https://gist.github.com/michael-grunder/9257326

$ ./redis-cli --bigkeys

# Press ctrl+c when you have had enough of it... :)
# You can use -i 0.1 to sleep 0.1 sec every 100 sampled keys
# in order to reduce server load (usually not needed).

Biggest string so far: day:uv:483:1201737600, size: 2
Biggest string so far: day:pv:2013:1315267200, size: 3
Biggest string so far: day:pv:3:1290297600, size: 5
Biggest zset so far: day:topref:2734:1289433600, size: 3
Biggest zset so far: day:topkw:2236:1318723200, size: 7
Biggest zset so far: day:topref:651:1320364800, size: 20
Biggest string so far: uid:3467:auth, size: 32
Biggest set so far: uid:3029:allowed, size: 1
Biggest list so far: last:175, size: 51


-------- summary -------

Sampled 329 keys in the keyspace!
Total key length in bytes is 15172 (avg len 46.12)

Biggest   list found 'day:uv:483:1201737600' has 5235597 items
Biggest    set found 'day:uvx:555:1201737600' has 47 members
Biggest   hash found 'day:uvy:131:1201737600' has 2888 fields
Biggest   zset found 'day:uvz:777:1201737600' has 1000 members

0 strings with 0 bytes (00.00% of keys, avg size 0.00)
19 lists with 5236744 items (05.78% of keys, avg size 275618.11)
50 sets with 112 members (15.20% of keys, avg size 2.24)
250 hashs with 6915 fields (75.99% of keys, avg size 27.66)
10 zsets with 1294 members (03.04% of keys, avg size 129.40)
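Since the instance in the question is a 7M-key production server, the sleep option mentioned in the header of the output above can keep the sampling gentle; an example invocation (host/port are placeholders):

redis-cli -h 127.0.0.1 -p 6379 --bigkeys -i 0.1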
2020-06-20