Consider a collection with the following documents
```javascript
{ "_id" : "aaaaaaaaaaaa", "title" : "Hello, World!", "date" : "Thursday, November 12, 2015", "time" : "9:30 AM", "endtime" : "11:30 AM" },
{ "_id" : "bbbbbbbbbbbb", "title" : "To B or not to B", "date" : "Thursday, November 12, 2015", "time" : "10:30 AM", "endtime" : "11:00 AM" },
{ "_id" : "cccccccccccc", "title" : "Family Time", "date" : "Thursday, November 12, 2015", "time" : "10:30 AM", "endtime" : "12:00 PM" }
```
In this simplified output, I have events that have had their start times, ending times, and dates all entered as strings. How can I use fields in an update() that use the existing data to calculate new, properly formed Date()-type data that I can actually query?
The following works to create a new "iso_start" field
```javascript
db.events.update({}, { $set: { iso_start: Date() } }, { multi: true })
```
I imagined I would be able to build a sort of update-select like so
```javascript
db.events.update({}, { $set: { iso_start: Date(date + " " + time) } }, { multi: true })
db.events.update({}, { $set: { iso_end: Date(date + " " + endtime) } }, { multi: true })
```
but I get the error "date is not defined."
Update: this.date and this.time made the "not defined" error go away; however, the dates inserted were the present datetime. I tried writing new Date(), but then the date inserted was ISODate("0NaN-NaN-NaNTNaN:NaN:NaNZ").
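Both symptoms follow from how JavaScript's Date behaves: called without new, Date() ignores its arguments and returns the current time as a string, and new Date() on a string the parser cannot handle yields an Invalid Date. A minimal sketch (plain JavaScript, runnable outside the mongo shell) illustrating both pitfalls:

```javascript
// Date() without `new` ignores its arguments and returns the current
// datetime as a string -- this is why every document got "now".
var asString = Date("Thursday, November 12, 2015 9:30 AM");
console.log(typeof asString); // "string"

// Inside an update document, `date` and `time` are not resolved per
// document, so the parser ends up with an unparseable value and the
// result is an Invalid Date (serialized as ISODate("0NaN-NaN-NaN...")).
var invalid = new Date(undefined + " " + undefined);
console.log(isNaN(invalid.getTime())); // true
```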
You need to use the .aggregate() method, which provides access to the aggregation pipeline.
In your $project stage you need to use the $concat operator to concatenate your fields.
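What $concat produces for each document is just the joined string; the shell then needs new Date(...) to parse it. A quick sketch in plain JavaScript of that concatenation and parse, using the sample document from the question (note: parsing this free-form date format relies on the engine's lenient legacy parser, so treat the parse as an assumption rather than a guarantee):

```javascript
var doc = { date: "Thursday, November 12, 2015", time: "9:30 AM", endtime: "11:30 AM" };

// Mimic { "$concat": [ "$date", " ", "$time" ] }
var isoStartString = doc.date + " " + doc.time; // "Thursday, November 12, 2015 9:30 AM"

// The shell's JavaScript engine accepts this legacy format and
// produces a real Date (in the server's local time zone).
var isoStart = new Date(isoStartString);
console.log(isoStart.getFullYear()); // 2015
```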
You can then use your aggregation result to update your collection, using "bulk" operations for efficiency.
```javascript
var bulk = db.events.initializeOrderedBulkOp();
var count = 0;

db.events.aggregate([
    { "$project": {
        "iso_start": { "$concat": [ "$date", " ", "$time" ] },
        "iso_end": { "$concat": [ "$date", " ", "$endtime" ] }
    }}
]).forEach(function(doc) {
    bulk.find({ "_id": doc._id }).updateOne({
        "$set": {
            "iso_start": new Date(doc.iso_start),
            "iso_end": new Date(doc.iso_end)
        }
    });
    count++;
    if (count % 200 === 0) {
        // Execute per 200 operations and re-init
        bulk.execute();
        bulk = db.events.initializeOrderedBulkOp();
    }
});

// Flush the remaining queued operations (skip if the last batch
// was already executed, since executing an empty bulk throws)
if (count % 200 !== 0)
    bulk.execute();
```
After this operation your documents look like this:
```javascript
{ "_id" : "aaaaaaaaaaaa", "title" : "Hello, World!", "date" : "Thursday, November 12, 2015", "time" : "9:30 AM", "endtime" : "11:30 AM", "iso_start" : ISODate("2015-11-12T06:30:00Z"), "iso_end" : ISODate("2015-11-12T08:30:00Z") }
{ "_id" : "bbbbbbbbbbbb", "title" : "To B or not to B", "date" : "Thursday, November 12, 2015", "time" : "10:30 AM", "endtime" : "11:00 AM", "iso_start" : ISODate("2015-11-12T07:30:00Z"), "iso_end" : ISODate("2015-11-12T08:00:00Z") }
```
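With iso_start and iso_end stored as real dates, range queries such as db.events.find({ iso_start: { $gte: someDate } }) now work, because Date values compare by their underlying millisecond timestamps. A small plain-JavaScript sketch of that comparison, using the values from the sample output above (the filter here just mimics the query, it is not a MongoDB API):

```javascript
// Dates taken from the two converted sample documents
var events = [
    { _id: "aaaaaaaaaaaa", iso_start: new Date("2015-11-12T06:30:00Z") },
    { _id: "bbbbbbbbbbbb", iso_start: new Date("2015-11-12T07:30:00Z") }
];

// Equivalent of db.events.find({ iso_start: { $gte: cutoff } }):
// Date objects compare by their millisecond timestamps.
var cutoff = new Date("2015-11-12T07:00:00Z");
var later = events.filter(function(e) { return e.iso_start >= cutoff; });
console.log(later.length); // 1
```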
That is not the end of the story, because the "Bulk" API and its associated methods are deprecated in the forthcoming release (version 3.2), so from that version on we will need to use the db.collection.bulkWrite() method.
```javascript
var operations = [];

db.events.aggregate([
    { "$project": {
        "iso_start": { "$concat": [ "$date", " ", "$time" ] },
        "iso_end": { "$concat": [ "$date", " ", "$endtime" ] }
    }}
]).forEach(function(doc) {
    operations.push({
        updateOne: {
            filter: { "_id": doc._id },
            update: {
                "$set": {
                    "iso_start": new Date(doc.iso_start),
                    "iso_end": new Date(doc.iso_end)
                }
            }
        }
    });
});

// Options go in the second argument to bulkWrite,
// not into the operations array itself
db.events.bulkWrite(operations, {
    ordered: true,
    writeConcern: { w: "majority", wtimeout: 5000 }
})
```
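For large collections, the bulkWrite approach benefits from the same "execute per 200 operations" idea used in the Bulk API example. A hedged sketch of that chunking logic in plain JavaScript (the helper name and batch size are illustrative, not part of the original answer or of MongoDB's API):

```javascript
// Split a flat list of write operations into batches of at most `size`,
// so each bulkWrite call stays reasonably small.
function chunkOperations(operations, size) {
    var batches = [];
    for (var i = 0; i < operations.length; i += size) {
        batches.push(operations.slice(i, i + size));
    }
    return batches;
}

// 450 queued updates -> batches of 200, 200 and 50
var ops = new Array(450).fill({ updateOne: {} });
var batches = chunkOperations(ops, 200);
console.log(batches.map(function(b) { return b.length; })); // [ 200, 200, 50 ]

// In the shell each batch would then be flushed with:
//   batches.forEach(function(batch) { db.events.bulkWrite(batch); });
```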