
bitcoin - "Too many open files" error while reindexing the BTC blockchain with electrs

I'm using the electrs backend (https://github.com/Blockstream/electrs) and its API documentation to build a BTC blockchain index engine and local HTTP API on Linux.
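For reference, the build itself is the standard Rust workflow from the repository; I'm not reproducing my exact run flags here, since they vary between electrs forks and versions - cargo run --release -- --help lists the options your checkout actually supports:

$ git clone https://github.com/Blockstream/electrs
$ cd electrs
$ cargo build --release
$ # flags differ between forks/versions; list them before running
$ cargo run --release -- --help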

During indexing, an error occurred. I repeated the whole process more than once, and the error occurred in the same place each time - as far as I can tell, always right after the block-reading phase finishes:

DEBUG - writing 1167005 rows to RocksDB { path: "./db/mainnet/newindex/txstore" }, flush=Disable
TRACE - parsing 50331648 bytes
TRACE - fetched 101 blocks
DEBUG - writing 1144149 rows to RocksDB { path: "./db/mainnet/newindex/txstore" }, flush=Disable
TRACE - fetched 104 blocks
DEBUG - writing 1221278 rows to RocksDB { path: "./db/mainnet/newindex/txstore" }, flush=Disable
TRACE - skipping block 00000000000000000006160011df713a63b3bedc361b60bad660d5a76434ad59
TRACE - skipping block 00000000000000000005d70314d0dd3a31b0d44a5d83bc6c66a4aedbf8cf6207
TRACE - skipping block 00000000000000000001363a85233b4e4a024c8c8791d9eb0e7942a75be0d4de
TRACE - skipping block 00000000000000000008512cf84870ff39ce347e7c83083615a2731e34a3a956
TRACE - skipping block 0000000000000000000364350efd609c8b140d7b9818f15e19a17df9fc736971
TRACE - skipping block 0000000000000000000cc0a4fd1e418341f5926f0a6a5c5e70e4e190ed4b2251
TRACE - fetched 23 blocks
DEBUG - writing 1159426 rows to RocksDB { path: "./db/mainnet/newindex/txstore" }, flush=Disable
DEBUG - writing 1155416 rows to RocksDB { path: "./db/mainnet/newindex/txstore" }, flush=Disable
DEBUG - writing 232110 rows to RocksDB { path: "./db/mainnet/newindex/txstore" }, flush=Disable
DEBUG - starting full compaction on RocksDB { path: "./db/mainnet/newindex/txstore" }
DEBUG - finished full compaction on RocksDB { path: "./db/mainnet/newindex/txstore" }
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Error { message: "IO error: While open a file for random read: ./db/mainnet/newindex/txstore/000762.sst: Too many open files" }', src/new_index/db.rs:192:44
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Aborted (core dumped)

The size of the db directory (where the indexes are stored) is over 450 GB. My open-files limit is 1048576 (checked with ulimit -aH), so the problem is probably not there. I also checked the issue at https://github.com/Blockstream/esplora/issues/133, but it didn't help. Any ideas what went wrong?
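One check worth spelling out: ulimit -aH reports hard limits only, while the limit a process actually hits is its soft limit. Comparing both, and counting the descriptors the indexer currently holds, exposes the mismatch (the pgrep pattern below assumes the binary is named electrs):

$ ulimit -Hn    # hard limit (here: 1048576)
$ ulimit -Sn    # soft limit, the one actually enforced
$ # count file descriptors held by the running indexer
$ ls /proc/"$(pgrep -x electrs)"/fd | wc -l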

EDIT: The soft limit (checked with ulimit -n) was 1024 - that was the source of the problem. Setting it to 65000 solved it. I set it with ulimit -n 65000, which only worked for the current terminal session. I changed /etc/security/limits.conf, but the changes were not applied globally.
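For a persistent fix, the usual route is an entry in /etc/security/limits.conf, which pam_limits applies to new login sessions - so you must log out and back in for it to take effect, and it does not cover systemd services, which take LimitNOFILE= in their unit file instead. A minimal sketch, assuming the indexer runs as user btc:

# /etc/security/limits.conf - applied by pam_limits to new login sessions
btc    soft    nofile    65000
btc    hard    nofile    65000

# For a systemd service, set the limit in the unit file instead:
# [Service]
# LimitNOFILE=65000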

Question from: https://stackoverflow.com/questions/65847960/too-many-open-files-error-while-reindexing-btc-blockchain-by-electrs

1 Reply

Waiting for answers
