I'm fairly new to MongoDB / Mongoose, more used to SQL Server or Oracle.
I have a fairly simple Schema for an event.
EventSchema.add({
pkey: { type: String, unique: true },
device: { type: String, required: true },
name: { type: String, required: true },
owner: { type: String, required: true },
description: { type: String, required: true },
});
I was looking at Mongoose Indexes, which shows two ways of doing it; I used the field definition.
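As I understand it, the two ways boil down to this; a minimal sketch using the pkey field from the schema above:

// Style 1: unique in the field definition (what I used)
EventSchema.add({
  pkey: { type: String, unique: true }
});

// Style 2: at the schema level; this is also the form that
// supports compound indexes
EventSchema.index({ pkey: 1 }, { unique: true });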
I also have a very simple API that accepts a POST and calls create on this collection to insert the record.
I wrote a test to check that inserting a record with the same pkey is rejected, i.e. that unique: true is actually enforced. I already have a set of events that I read into an array, so I just POST the first of these events again and see what happens. I expected MongoDB to throw the E11000 duplicate key error, but this did not happen.
var url = 'api/events';
var evt = JSON.parse(JSON.stringify(events[0]));
// POST'ed new record won't have an _id yet
delete evt._id;
api.post(url)
  .send(evt)
  .end(function (err, res) {
    err.should.exist;
    err.code.should.equal(11000);
  });
The test fails, there is no error and a duplicate record is inserted.
When I take a look at the collection I can see two records, both with the same pkey (the original record and the copy that I posted for the test). I do notice that the second record has the same creation date as the first but a later modified date.
(Does Mongo expect me to use the latest modified version of the record? The URL is different and so is the _id.)
[ { _id: 2,
pkey: '6fea271282eb01467020ce70b5775319',
name: 'Event name 01',
owner: 'Test Owner',
device: 'Device X',
description: 'I have no idea what\'s happening',
__v: 0,
url: '/api/events/2',
modified: '2016-03-23T07:31:18.529Z',
created: '2016-03-23T07:31:18.470Z' },
{ _id: 1,
pkey: '6fea271282eb01467020ce70b5775319',
name: 'Event name 01',
owner: 'Test Owner',
device: 'Device X',
description: 'I have no idea what\'s happening',
__v: 0,
url: '/api/events/1',
modified: '2016-03-23T07:31:18.470Z',
created: '2016-03-23T07:31:18.470Z' }
]
I had assumed that unique: true on the field definition told MongoDB that this is what you wanted, and that Mongo enforced it for you at save time, or maybe I just misunderstood something...
In SQL terms you create a key that can be used in URL lookups, but you can also build a unique compound index to prevent duplicate inserts. I need to be able to define which fields in an event make the record unique, because on a form data POST the submitter of the form does not have the next available _id value. I use the _id (generated by "mongoose-auto-increment") so that the URLs used by other parts of the app are clean, like
/events/1
and not a complete mess of compound values, like
/events/Event%20name%2001%5fDevice%20X%5fTest%20Owner
I'm just about to start coding this up, so for now I just wrote a simple test against this single string field; the real schema has a few more fields and will use a combination of them for uniqueness (a rough sketch of that index is below). I really want to get the initial test working before I start adding more tests, more fields and more code.
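From what I've read, the schema-level form is how you'd declare that combination; a rough sketch, assuming name, device and owner turn out to be the fields that should be unique together (the actual combination is still to be decided):

// Hypothetical compound unique index across several fields
EventSchema.index({ name: 1, device: 1, owner: 1 }, { unique: true });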
Is there something that I should be doing to ensure that the second record does not actually get inserted?
It seems that you added the unique index (at the schema level) after inserting some records into the db. A unique index cannot be built while the collection already contains duplicate values, so the constraint never takes effect.
Please follow the steps below to avoid duplicates:
1) Drop your db:
$ mongo
> use <db-name>;
> db.dropDatabase();
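(Alternatively, if dropping the whole database is too destructive, I believe dropping just the affected collection also clears the duplicates; assuming the collection is named events:)
$ mongo
> use <db-name>;
> db.events.drop();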
2) Now define the index at the schema level (or directly at the db level):
var EventSchema = new mongoose.Schema({
pkey: { type: String, unique: true },
device: { type: String, required: true },
name: { type: String, required: true },
owner: { type: String, required: true },
description: { type: String, required: true },
});
This will prevent the insertion of duplicate records with the same pkey value.
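Once the index is in place, the duplicate insert fails with an E11000 error at save time, and your API route can surface it. A minimal sketch, assuming an Express-style handler and that Event is the compiled model (both names are assumptions, not from your code):

// Hypothetical POST handler; Event = mongoose.model('Event', EventSchema)
app.post('/api/events', function (req, res) {
  Event.create(req.body, function (err, event) {
    if (err && err.code === 11000) {
      // Duplicate key: a record with this pkey already exists
      return res.status(409).send(err);
    }
    if (err) return res.status(500).send(err);
    res.status(201).json(event);
  });
});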
To verify that the index exists, use the command db.<collection-name>.getIndexes() (e.g. db.events.getIndexes()).
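One more thing to be aware of: Mongoose builds indexes in the background when the model is first used, and if the build fails (for example because duplicates already exist) it emits an 'index' event rather than throwing on save. You can listen for that to catch the problem early; a small sketch, assuming Event is your model:

var Event = mongoose.model('Event', EventSchema);

// err is set if the index build failed, e.g. because the collection
// already contained duplicate pkey values
Event.on('index', function (err) {
  if (err) console.error('Event index build failed:', err);
});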
I hope it helps. Thank you.