Importing a Large Dataset in MongoDB

March 4th, 2010

Last night I imported a large, but not outrageously large, dataset into a MongoDB database. The one potential complication was the structure of the documents, though even that wasn't bad: each one was just a hash with two keys, one holding a string and the other holding an array.
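For illustration, a single document looked something like this (the field names here are invented; the post doesn't record the real ones):

{ "name" => "mongodb", "post_ids" => [101, 205, 312] }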

At first I was inserting the documents with Ruby 1.9.1, but it was taking too long, so I switched to the command-line tool, mongoimport.
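For context, the driver-based approach I abandoned looked roughly like this. This is a sketch, not my actual script: it assumes the mongo gem of that era and borrows the host, database, and collection names from the import command below.

require 'mongo'

# Connect to the remote MongoDB instance and grab the collection
# (Mongo::Connection is the old-style driver API).
tags = Mongo::Connection.new("192.168.8.103").db("doculabsappone").collection("tags")

# Placeholder for the real dataset, which had about 23,000 documents.
docs = [
  { "name" => "mongodb", "post_ids" => [101, 205, 312] }
]

# One insert call (and one network round trip) per document --
# which is what made this approach so slow for a large dataset.
docs.each { |doc| tags.insert(doc) }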

The cool thing about mongoimport is that it can import JSON directly. I converted my data set to JSON format and saved it to a single file with about 23,000 JSON objects.
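Since mongoimport reads one JSON document per line by default, the conversion was just a matter of serializing each hash and writing it out, something like this (the docs variable is a placeholder; tmp/tags.json matches the command below):

require 'json'

# docs: the in-memory dataset, an array of ~23,000 hashes (placeholder here).
docs = [
  { "name" => "mongodb", "post_ids" => [101, 205, 312] }
]

# Write one JSON object per line, the format mongoimport expects by default.
File.open("tmp/tags.json", "w") do |f|
  docs.each { |doc| f.puts doc.to_json }
end

With the file in place, I ran the import: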

mongoimport --host 192.168.8.103 --db doculabsappone -c tags < tmp/tags.json

It was way faster than using Ruby 1.9.1!
