I am creating a largish database in MongoDB (80k collections, 200k documents per collection). When I start pushing documents into the database, I keep getting the following error after ~800k inserts.
pymongo.errors.WriteError: 24: Too many open files
I have a decent-sized server (2 vCPU, 32 GB RAM). The ulimit for open files is set to unlimited, and limit nofile 999999 999999 is set in /etc/init/mongodb.conf.
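In case it is relevant, here is a rough sketch of how the limit the running mongod process actually sees can be checked, since settings in the init config do not always reach the daemon. This snippet is only illustrative and not part of my insertion script; it assumes a Linux host, a process named mongod, and sufficient privileges (likely root) to read the /proc entries:

import os
import subprocess

# Find the mongod PID and read the limits the daemon is really running with.
pid = subprocess.check_output(["pgrep", "-x", "mongod"]).decode().split()[0]

with open(f"/proc/{pid}/limits") as f:
    for line in f:
        if line.startswith("Max open files"):
            print(line.strip())  # effective open-file limit for mongod

# Number of file descriptors mongod currently holds open (usually needs root).
print("open fds:", len(os.listdir(f"/proc/{pid}/fd")))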
I have tried inserting a sleep into the insertion script (in the hope that MongoDB would close files after a while), but that hasn't helped either. The only thing that works is restarting MongoDB after the insertion has failed; the process can then resume.
How can I make the insertion process better, without having to pause after every few thousand inserts and restart MongoDB?
Sharing the Python script that transfers data points from Redis to Mongo:
import json

# redis_conn, redis_key and mongodb (a pymongo Database) are set up earlier
# in the script and omitted here.
while redis_conn.llen(redis_key) > 0:
    redis_json = redis_conn.lpop(redis_key)
    json_doc = json.loads(redis_json)
    collection_name = json_doc['collection_name']
    collection = mongodb[collection_name]      # one of the ~80k collections
    document = {'1': json_doc['1'], '2': json_doc['2']}
    mongo_write_response = collection.insert_one(document)
(For brevity I have simplified the document; the actual document has around 20 data points.)
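Not part of the script above, but to illustrate the kind of change I am considering: buffering documents per collection and flushing them with insert_many instead of issuing one insert_one per document. The batch size and buffering scheme below are just assumptions for the sketch, reusing the same redis_conn, redis_key and mongodb handles:

import json
from collections import defaultdict

BATCH_SIZE = 1000                      # illustrative value, tune as needed

buffers = defaultdict(list)            # collection name -> pending documents
while redis_conn.llen(redis_key) > 0:
    json_doc = json.loads(redis_conn.lpop(redis_key))
    name = json_doc['collection_name']
    buffers[name].append({'1': json_doc['1'], '2': json_doc['2']})
    if len(buffers[name]) >= BATCH_SIZE:
        mongodb[name].insert_many(buffers[name])   # one round trip per batch
        buffers[name] = []

# Flush whatever is left once the Redis list is drained.
for name, docs in buffers.items():
    if docs:
        mongodb[name].insert_many(docs)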