Creating a web app using MongoDB with Docker

Although relational databases have long been the mainstay of business applications, NoSQL databases such as MongoDB have become popular relatively recently. This is because NoSQL databases can scale more efficiently than relational databases for certain kinds of data. There are still many types of data that are best served by a relational database, so you should not blindly use a NoSQL database just because it seems more modern. Use a NoSQL database when it suits the data that you need to store and manage.

MongoDB is one of the more popular NoSQL databases because, with its collections, it can be treated similarly to a relational database. Many people working with databases are still more familiar with relational databases than with NoSQL databases, so MongoDB will feel familiar to them. Thinking of a collection in MongoDB as being like a table in a relational database is a useful way to get going with MongoDB.

Using Docker to provide MongoDB

A good way to use MongoDB as your database server is to run it in a Docker container. I made a working directory named mongo_db. The path to this directory will be designated as /path/to/mongo_db. The first thing I created was a file called Dockerfile in that directory. This file has just one line:

/path/to/mongo_db/Dockerfile
FROM mongodb/mongodb-community-server:latest

This line pulls the latest community-server edition of the MongoDB image.

Creating the Docker image

To create the Docker image that provides MongoDB, I created the following script:

/path/to/mongo_db/create_image.sh
docker build -t mongo_img .

This will create an image called mongo_img from the Dockerfile in the current directory (.). I ran this script using the following commands:

$ cd /path/to/mongo_db
$ sh create_image.sh

Creating the Docker container

I created the following script to create the Docker container called mongo_cont.

/path/to/mongo_db/create_container.sh
docker container run -d --name=mongo_cont mongo_img

I ran this with the following commands:

$ cd /path/to/mongo_db
$ sh create_container.sh

After running this, you can see that the container is running by entering the following command:

$ docker container ls
CONTAINER ID   IMAGE       COMMAND                  CREATED          STATUS          PORTS       NAMES
406a2ae0c293   mongo_img   "python3 /usr/local/…"   10 seconds ago   Up 10 seconds   27017/tcp   mongo_cont

As you can see, the container mongo_cont is running and is using the mongo_img image.

Entering the Docker container

You can enter the following commands to get a bash prompt inside the Docker container.

$ cd /path/to/mongo_db
$ docker start mongo_cont
$ docker exec -it mongo_cont bash

If you run docker start mongo_cont and the container is already running, it simply continues to run. The last line uses the exec command to execute a command inside the container. The -it switch gives you an interactive terminal, mongo_cont is the name of the container to run the command in, and bash is the command to run. You will get a bash prompt as the mongodb user inside the container. From there you can run the Mongo shell, mongosh, to work with the MongoDB server interactively.

$ docker exec -it mongo_cont bash
mongodb@406a2ae0c293:/docs$ mongosh
Current Mongosh Log ID:	689afb8add791d4cbf89b03c
Connecting to:		mongodb://127.0.0.1:27017/?directConnection=true&serverSelectionTimeoutMS=2000&appName=mongosh+2.5.6
Using MongoDB:		8.0.12
Using Mongosh:		2.5.6

For mongosh info see: https://www.mongodb.com/docs/mongodb-shell/


To help improve our products, anonymous usage data is collected and sent to MongoDB periodically (https://www.mongodb.com/legal/privacy-policy).
You can opt-out by running the disableTelemetry() command.

------
   The server generated these startup warnings when booting
   2025-08-12T08:22:04.401+00:00: Using the XFS filesystem is strongly recommended with the WiredTiger storage engine. See http://dochub.mongodb.org/core/prodnotes-filesystem
   2025-08-12T08:22:04.541+00:00: Access control is not enabled for the database. Read and write access to data and configuration is unrestricted
   2025-08-12T08:22:04.541+00:00: For customers running the current memory allocator, we suggest changing the contents of the following sysfsFile
   2025-08-12T08:22:04.541+00:00: For customers running the current memory allocator, we suggest changing the contents of the following sysfsFile
   2025-08-12T08:22:04.541+00:00: We suggest setting the contents of sysfsFile to 0.
   2025-08-12T08:22:04.541+00:00: Your system has glibc support for rseq built in, which is not yet supported by tcmalloc-google and has critical performance implications. Please set the environment variable GLIBC_TUNABLES=glibc.pthread.rseq=0
   2025-08-12T08:22:04.541+00:00: vm.max_map_count is too low
   2025-08-12T08:22:04.541+00:00: We suggest setting swappiness to 0 or 1, as swapping can cause performance problems.
------

test> show dbs
admin   40.00 KiB
config  12.00 KiB
local   40.00 KiB
test> use joe_db
switched to db joe_db
joe_db> db.students.insertOne({ first_name: "Jane", last_name: "Doe", major: "Biology"});
{
  acknowledged: true,
  insertedId: ObjectId('689afbdbdd791d4cbf89b03d')
}
joe_db> db.students.find({});
[
  {
    _id: ObjectId('689afbdbdd791d4cbf89b03d'),
    first_name: 'Jane',
    last_name: 'Doe',
    major: 'Biology'
  }
]
joe_db>

The first command gets a bash prompt inside the Docker container, and the second runs mongosh, which produces the startup output shown above. The test> prompt shows that the database named test is being used; this is the default working database. The show dbs command displays the databases that are permanently stored. The use joe_db command switches the working database to joe_db. Next, db.students.insertOne() is used to insert an object. Note that the object is in JSON format; most NoSQL databases consist of JSON objects. The acknowledged: true output means that the insertOne() command succeeded. Note that this implicitly creates the collection called students.

Finally, the command db.students.find({}) is run. This searches the students collection and returns anything that matches the argument to find(). Passing an empty set of curly braces ({}) returns all objects, which is the array shown at the end of the transcript.
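The argument to find() can also be a filter document, and an optional second argument selects which keys to return. As a hedged sketch, run in mongosh against the students collection above, the following should return only the Biology majors and show just the name fields:

joe_db> db.students.find({ major: "Biology" }, { _id: 0, first_name: 1, last_name: 1 });
[ { first_name: 'Jane', last_name: 'Doe' } ]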

Creating a script to initialize a MongoDB database

Although we can work with the MongoDB database using the mongo shell (mongosh), this is not the best way to initialize the database. For that, we want to run a JavaScript script.

Installing the Node.js packages

To run an initializing script, we need the mongodb Node package. We might as well install all the Node packages we will use at this point.

$ cd /path/to/mongo_db
$ npm install express mongodb mongoose cors

We will need express to provide the framework for interacting with the database, and mongodb to connect to the MongoDB database. We will also use the mongoose package, as this provides a schema for interacting with the database; this will make our functions similar to how we interact with a relational database. The cors package is needed to keep the browser from showing a CORS error when our front end web application connects to the backend. The index.js file shown later also requires body-parser; Express 4 pulls it in as a dependency, but you can install it explicitly with npm install body-parser if you prefer.

Using mongodb to initialize the database

Now that we have mongodb installed, here is a script that uses that package to initialize the database. This is the code for init.js:

/path/to/mongo_db/init.js
const { MongoClient } = require('mongodb');

const url = 'mongodb://172.17.0.2:27017';
const client = new MongoClient(url);

const dbName = 'joe_db';
let student_array = [];

async function main() {
  try {
    await client.connect();
    const db = client.db(dbName);
    const students = db.collection('students');
    await students.drop();
    student_array.push({ _id: 1, first_name: "Jane", last_name: "Doe", major: "Biology" });
    student_array.push({ _id: 2, first_name: "John", last_name: "Doe", major: "Chemistry" });
    student_array.push({ _id: 3, first_name: "Bob", last_name: "Simmons", major: "Physics" });
    await students.insertMany(student_array);
    const result = await students.find({}).toArray();
    console.log(result);
  }
  catch (error) {
    console.error('Error occurred:', error);
    process.exit();
  }
  finally {
    client.close();
  }
}

main();

Line 1 makes mongodb available to this script and extracts the MongoClient class from it. Line 3 defines the URL part of the connection string. The port for the MongoDB service is 27017 by default. The IP address of the container is found using the following command:

$ docker start mongo_cont
$ docker inspect mongo_cont | grep -i ipaddress
            "SecondaryIPAddresses": null,
            "IPAddress": "172.17.0.2",
                    "IPAddress": "172.17.0.2",

Line 4 creates a client for connecting to the MongoDB service. Line 6 specifies the name of the database we want to use. Line 7 defines an array, student_array, that we will use to hold our student JSON objects. Lines 9-29 define the main() function. This function is declared as async because we need to await the operations that involve database interaction; the mongodb package's functions return promises, so they can be called with await.

Lines 10-21 define the try block for the main() function. Line 11 attempts to connect to the MongoDB database. If that succeeds, line 12 selects the joe_db database. Line 13 gets the collection named students. On line 14, we wait for the students collection to be dropped, which clears out its existing objects (documents). Lines 15-17 add some student objects (documents) to student_array. Line 18 waits for the objects in that array to be inserted into the students collection. Line 19 waits for the find() command to return all of the objects in the collection; this is converted into an array and returned as result. Line 20 prints result to the console.

Lines 22-25 define the catch block for the main() function. If any error occurs inside the try block, the error will be displayed and the whole process is aborted.

Lines 26-28 define the finally block for the main() function. This block will close the connection to the database.

Line 31 runs the main() function.

Running this program produces the following output:

$ cd /path/to/mongo_db
$ node init.js
[
  { _id: 1, first_name: 'Jane', last_name: 'Doe', major: 'Biology' },
  { _id: 2, first_name: 'John', last_name: 'Doe', major: 'Chemistry' },
  { _id: 3, first_name: 'Bob', last_name: 'Simmons', major: 'Physics' }
]

So, now our joe_db database has been initialized with a collection called students containing three objects. As mentioned previously, NoSQL databases store JSON objects, often referred to as documents. There are no restrictions on the contents of those JSON documents, so the key names are not set in stone by the database: you can have documents that are missing some of the keys we used, or documents that have additional keys. This may seem very flexible, but it can also lead to problems when the programmer expects all documents to have the same keys (properties). This is why we will use the mongoose package, as it provides a schema to help ensure that the documents all have the same keys.
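To see this flexibility concretely, here is a hedged sketch using the same mongodb driver and connection string as init.js (the file name add_extra.js is hypothetical). It inserts one document with an extra key and one with a missing key, and MongoDB accepts both:

/path/to/mongo_db/add_extra.js
// Hypothetical demonstration script; not part of the application.
const { MongoClient } = require('mongodb');

const client = new MongoClient('mongodb://172.17.0.2:27017');

async function main() {
  try {
    await client.connect();
    const students = client.db('joe_db').collection('students');
    // Extra key: "minor" does not appear on the other documents.
    await students.insertOne({ first_name: "Sue", last_name: "Storm", major: "Math", minor: "Art" });
    // Missing key: no "major" at all.
    await students.insertOne({ first_name: "Reed", last_name: "Richards" });
    console.log(await students.find({}).toArray());
  }
  finally {
    await client.close();
  }
}

main();

If you try this, re-run init.js afterwards so the collection matches the rest of the walkthrough.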

Creating the middleware

Now that we have our backend database, we need to create the middleware that provides the routes that allow the front end web application to interact with it. Similar to what we did with the Docker PostgreSQL container, we will create the index.js and queries.js files for that purpose. Let’s start with index.js:

/path/to/mongo_db/index.js
const express = require('express');
const bodyParser = require('body-parser');
const app = express();

app.use(bodyParser.json());
app.use(
   bodyParser.urlencoded({
      extended: true
   })
);
app.get('/', function(req, resp) {
   resp.send('hello from MongoDB index.js\n');
});

app.listen(3000, () => {
   console.log('app listening on port 3000');
});

Line 1 makes the express module available in the script. Line 2 makes the body-parser module available. Line 3 creates our express() application. Line 5 makes it so that our app treats the body content as JSON. Lines 6-10 allow the app to properly parse requests that come from URL-encoded payloads (such as HTML forms).
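As an aside, Express 4.16 and later bundle equivalent parsers, so the two body-parser calls could be replaced with the built-ins. A hedged sketch of that variant (the rest of this article keeps body-parser to match the files as written):

// Equivalent parsing using the parsers built into Express 4.16+
app.use(express.json());
app.use(express.urlencoded({ extended: true }));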

Lines 11-13 define a default route that simply prints 'hello from MongoDB index.js'. This route is just used to test whether the app is working properly or not. Lines 15-17 cause the app to listen on port 3000 and display the message 'app listening on port 3000'. So here is what I ran in one terminal:

$ cd /path/to/mongo_db
$ node index.js
app listening on port 3000

While this is running, I open another terminal so I can test this simple route using curl:

$ curl http://localhost:3000/
hello from MongoDB index.js

So, the simple test route works. Before adding another route, I created the queries.js file:

/path/to/mongo_db/queries.js
const mongoose = require('mongoose');

async function connect() {
  try {
    await mongoose.connect("mongodb://172.17.0.2:27017/joe_db");
  }
  catch (error) {
    console.log(error);
    process.exit();
  }
}
connect();

const studentSchema = mongoose.Schema({
   first_name: String,
   last_name: String,
   major: String
});
const Student = mongoose.model('Student',studentSchema);

function getStudents(req, resp) {
   Student.find().
      then(students => {
         resp.status(200).json(students);
   });
}

module.exports = { getStudents };

Line 1 makes the mongoose module available. Lines 3-11 define an async function called connect(). Lines 4-6 define a try block that attempts to get a connection to the database using the connection string mongodb://172.17.0.2:27017/joe_db. That string consists of the driver name, mongodb, the IP address, 172.17.0.2, the port, 27017, and the database name, joe_db. If there are any errors in the connection string, you will get an empty result when you call getStudents(). Lines 7-10 define the catch block, which will show an error in the terminal where you run node index.js. That error is not very descriptive, but at least the express app will stop running.

Line 12 calls the connect() function to get the connection.

Lines 14-18 define a schema that describes what the JSON objects should contain. This is sort of like having defined columns for a SQL table. Line 19 creates the Student model that will be used to interact with the database. The model is what keeps documents consistent with the schema (by default, mongoose drops keys that are not in the schema), and it also provides a number of mongoose functions that make it easier to perform queries on the database.
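If you want the schema to actually enforce the presence of certain keys, mongoose supports per-field options such as required. Here is a hedged sketch (not used in the tutorial files) that would make save() reject a student without a first and last name:

// Stricter variant of studentSchema; save() fails validation if
// first_name or last_name is missing.
const strictStudentSchema = mongoose.Schema({
   first_name: { type: String, required: true },
   last_name: { type: String, required: true },
   major: String
});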

Lines 21-26 define the getStudents() function. This uses the find() function called on the Student model, which behaves similarly to the find() function you can use inside the mongo shell.

Line 28 exports the getStudents() function so it can be used inside of index.js.
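Note that getStudents() relies on the promise resolving; if the query fails, no response is ever sent. A hedged sketch of an async/await variant that also reports errors:

// Alternative getStudents() with explicit error handling.
async function getStudents(req, resp) {
   try {
      const students = await Student.find();
      resp.status(200).json(students);
   }
   catch (error) {
      resp.status(500).send('error fetching students');
   }
}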

Here is the new version of index.js that makes use of queries.js:

/path/to/mongo_db/index.js
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const db = require('./queries.js');

app.use(bodyParser.json());
app.use(
   bodyParser.urlencoded({
      extended: true
   })
);
app.get('/', function(req, resp) {
   resp.send('hello from MongoDB index.js\n');
});
app.get('/students', db.getStudents);

app.listen(3000, () => {
   console.log('app listening on port 3000');
});

The new lines are 4 and 15. Line 4 makes the queries.js module available. Line 15 adds a GET route at /students that calls the getStudents() function exported from queries.js. Since we have made changes to index.js and queries.js, we need to stop the express server and restart it (node index.js). After doing that, we can test the new route using curl:

$ node index.js
app listening on port 3000
^C
$ node index.js
app listening on port 3000

Line 1 shows that the express server was running. Line 3 shows that I hit CTRL+C to stop that process. Line 4 shows the express server being started up again.

In another terminal, I ran the following curl command:

$ curl http://localhost:3000/students
[{"first_name":"Jane","last_name":"Doe","major":"Biology"},{"first_name":"John","last_name":"Doe","major":"Chemistry"},{"first_name":"Bob","last_name":"Simmons","major":"Physics"}]

As you can see, this new route works.

Adding a route to add a student

Let’s modify queries.js and then index.js to allow adding a student. Here is the new version of queries.js:

/path/to/mongo_db/queries.js
const mongoose = require('mongoose');

async function connect() {
  try {
    await mongoose.connect("mongodb://172.17.0.2:27017/joe_db");
  }
  catch (error) {
    console.log(error);
    process.exit();
  }
}
connect();

const studentSchema = mongoose.Schema({
   _id: Number,
   first_name: String,
   last_name: String,
   major: String
});
const Student = mongoose.model('Student',studentSchema);

function getStudents(req, resp) {
   Student.find().
      then(students => {
         resp.status(200).json(students);
   });
}

async function getLastStudentId() {
   let id;
   try {
      const students = await Student.find().sort('_id');
      const id = students[students.length-1]['_id'];
      return Number(id);
   }
   catch (error) {
      console.log(error);
   }
}

async function saveStudent(req, resp) {
   let id = await getLastStudentId() + 1;
   const { first_name, last_name, major } = req.body;
   let student = Student({ _id: id, first_name: first_name, last_name: last_name, major: major });
   await student.save();
   resp.status(201).send('student saved');
}

module.exports = { getStudents, saveStudent };

The new lines are 15, 29-39, 41-47 and 49. Line 49 has just been modified to export the saveStudent() function. Note that the new function getLastStudentId() is not exported as it is only used by the saveStudent() function. So, getLastStudentId() does not need to be called outside of queries.js.

Line 15 changes the schema by including the _id key. Note that MongoDB will add an ObjectId for every object that is stored. However, that is a long value and is not sequential, so you can supply your own id by setting the _id field yourself. This makes for ids that are readily identifiable and easier to work with in the front end web application. But it also means that you need to implement a way of generating that id, as there is no "auto increment" field type in MongoDB. This is why the getLastStudentId() function on lines 29-39 was created.

Lines 29-39 define the getLastStudentId() function. As stated in the previous paragraph, this function is used to help generate a sequential integer for the _id key. The try block on lines 31-35 uses find() and then sort() to get all the objects in the students collection in order by the _id key. Line 33 gets the last _id used and line 34 returns it as a number. Lines 36-38 handle any error that might occur in the try block.
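A common alternative to scanning the collection for the highest _id (not used in this article) is to keep a separate counters collection and increment it atomically with findOneAndUpdate(). A hedged sketch, with a hypothetical Counter model:

// Hypothetical counters collection: one document per sequence name.
const Counter = mongoose.model('Counter',
   mongoose.Schema({ _id: String, seq: Number }));

async function getNextStudentId() {
   // $inc is atomic, so two concurrent saves cannot receive the same id.
   const counter = await Counter.findOneAndUpdate(
      { _id: 'student_id' },
      { $inc: { seq: 1 } },
      { new: true, upsert: true });
   return counter.seq;
}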

Lines 41-47 define the saveStudent() function. This function gets its name from the save() function that is called on the Student object on line 45. The function is declared as async because we need to wait for getLastStudentId() to complete and also for the student.save() call to complete. Line 44 constructs a Student object using the model created on line 20 above. Once the save() call succeeds, line 46 sets the status to 201 and sends 'student saved' as the response message.

The index.js file needs to be updated too. Here is the new version of index.js:

/path/to/mongo_db/index.js
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const db = require('./queries.js');

app.use(bodyParser.json());
app.use(
   bodyParser.urlencoded({
      extended: true
   })
);
app.get('/', function(req, resp) {
   resp.send('hello from MongoDB index.js\n');
});
app.get('/students', db.getStudents);
app.post('/students', db.saveStudent);

app.listen(3000, () => {
   console.log('app listening on port 3000');
});

The only new line is 16. This is where the route is set up as a POST route using http://localhost:3000/students as the URL. Note that this route is associated with the saveStudent() function defined in queries.js.

Testing the new route

Since this is a POST route, the easiest way to test it is to use Postman. Here is a screen shot showing a test of this new POST route:

saving student

Note that the params are passed in the body using the x-www-form-urlencoded format, that the request verb is POST, and that the URL is http://localhost:3000/students. After clicking the Send button, the response area shows student saved.
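If you would rather test from the command line than Postman, here is a hedged sketch that uses axios (the same library the front end uses later). The file name test_post.js and the student data are hypothetical, and axios would need to be installed locally (npm install axios):

/path/to/mongo_db/test_post.js
// Hypothetical command-line test of the POST route.
const axios = require('axios');

axios.post('http://localhost:3000/students', {
   first_name: 'Test',
   last_name: 'Student',
   major: 'Math'
}).then(resp => console.log(resp.status, resp.data))
  .catch(error => console.log(error.message));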

To check, I used Postman to run a GET request for the same URL. Here is a screen shot of that. I scrolled down far enough so you can see the newly added student.

getting students after save

Adding a route to update a student

The next route to create is one that will allow updating a student object. Let’s work on queries.js first. Here is the new version:

/path/to/mongo_db/queries.js
const mongoose = require('mongoose');

async function connect() {
  try {
    await mongoose.connect("mongodb://172.17.0.2:27017/joe_db");
  }
  catch (error) {
    console.log(error);
    process.exit();
  }
}
connect();

const studentSchema = mongoose.Schema({
   _id: Number,
   first_name: String,
   last_name: String,
   major: String
});
const Student = mongoose.model('Student',studentSchema);

function getStudents(req, resp) {
   Student.find().
      then(students => {
         resp.status(200).json(students);
   });
}

async function getLastStudentId() {
   let id;
   try {
      const students = await Student.find().sort('_id');
      const id = students[students.length-1]['_id'];
      return Number(id);
   }
   catch (error) {
      console.log(error);
   }
}

async function saveStudent(req, resp) {
   let id = await getLastStudentId() + 1;
   const { first_name, last_name, major } = req.body;
   let student = Student({ _id: id, first_name: first_name, last_name: last_name, major: major });
   await student.save();
   resp.status(201).send('student saved');
}

async function updateStudent(req, resp) {
   const id = Number(req.params.id);
   const { first_name, last_name, major } = req.body;
   try {
      const student = await Student.findByIdAndUpdate(id,
         { first_name: first_name, last_name: last_name, major: major },
         { new: true });
      resp.status(200).send('student updated');
   }
   catch (error) {
      throw error;
   }
}

module.exports = { getStudents, saveStudent, updateStudent };

The new lines are 49-61 and 63. Line 63 just updates the exported functions to include updateStudent(). Lines 49-61 define the updateStudent() function. Line 50 obtains the id number from the URL, so the URL for this route would be http://localhost:3000/students/4 if we wanted to update the student with _id = 4. Line 51 obtains the rest of the key values from the request body. Lines 52-57 define a try block that executes the findByIdAndUpdate() function. The first argument to this function, on line 53, is the id. The next argument, on line 54, is the object containing the updated values for the student. Line 55 has the third argument, which makes it so that when the result returns on success, the new student (instead of the one that was replaced) is returned. This can be a source of confusion if you are using the returned student as part of the response to the request; surprisingly, the default behavior of findByIdAndUpdate() is to return the old student. Line 56 sets the status to 200 and sends 'student updated' in the response. Lines 58-60 define the catch block, which will throw any error that occurs.
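One case this version does not handle is an id that matches nothing: findByIdAndUpdate() then resolves to null, but the route still answers 'student updated'. A hedged sketch of a variant that reports that case:

// Variant of updateStudent() that reports a missing id.
async function updateStudent(req, resp) {
   const id = Number(req.params.id);
   const { first_name, last_name, major } = req.body;
   try {
      const student = await Student.findByIdAndUpdate(id,
         { first_name: first_name, last_name: last_name, major: major },
         { new: true });
      if (student) {
         resp.status(200).send('student updated');
      }
      else {
         resp.status(404).send('student not found');
      }
   }
   catch (error) {
      resp.status(500).send('error updating student');
   }
}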

The index.js file needs to be updated to define the new route. Here is the new version of index.js:

/path/to/mongo_db/index.js
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const db = require('./queries.js');

app.use(bodyParser.json());
app.use(
   bodyParser.urlencoded({
      extended: true
   })
);
app.get('/', function(req, resp) {
   resp.send('hello from MongoDB index.js\n');
});
app.get('/students', db.getStudents);
app.post('/students', db.saveStudent);
app.put('/students/:id', db.updateStudent);

app.listen(3000, () => {
   console.log('app listening on port 3000');
});

The new line is line 17. Since this is an update, it uses the PUT verb. Note that the route specifies that the URL has the id appended to the end of it, so you would use a URL like http://localhost:3000/students/4 to update the student with _id = 4. This route calls the updateStudent() function defined in queries.js to get its job done.

Since this is a PUT request, it is easiest to test in Postman. Here is a screen shot showing updating Ralph Moody’s major to Physics:

update student

In another terminal, I used the curl command to see the result of the update:

$ curl http://localhost:3000/students
[{"_id":1,"first_name":"Jane","last_name":"Doe","major":"Biology"},
{"_id":2,"first_name":"John","last_name":"Doe","major":"Chemistry"},
{"_id":3,"first_name":"Bob","last_name":"Simmons","major":"Physics"},
{"_id":4,"first_name":"Ralph","last_name":"Moody","major":"Physics","__v":0}]

Note that the last object shows the updated major of Physics, confirming that the update worked. The version key ("__v":0) on that object was added by mongoose when the student was saved through the Student model. Also, note that I added some line breaks so you don’t have to scroll to the right to see all the objects.
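If you prefer that mongoose not add the __v key at all, the schema accepts a versionKey option. A hedged sketch:

// Schema variant that suppresses the __v key on saved documents.
const studentSchema = mongoose.Schema({
   _id: Number,
   first_name: String,
   last_name: String,
   major: String
}, { versionKey: false });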

Looking ahead to the front end web application, I will need a route that gets a student object based on the _id key, so that I can use it to populate the edit dialog box with the student’s information. Here is the new version of queries.js that defines the function needed to support this route.

/path/to/mongo_db/queries.js
const mongoose = require('mongoose');

async function connect() {
  try {
    await mongoose.connect("mongodb://172.17.0.2:27017/joe_db");
  }
  catch (error) {
    console.log(error);
    process.exit();
  }
}
connect();

const studentSchema = mongoose.Schema({
   _id: Number,
   first_name: String,
   last_name: String,
   major: String
});
const Student = mongoose.model('Student',studentSchema);

function getStudents(req, resp) {
   Student.find().
      then(students => {
         resp.status(200).json(students);
   });
}

async function getLastStudentId() {
   let id;
   try {
      const students = await Student.find().sort('_id');
      const id = students[students.length-1]['_id'];
      return Number(id);
   }
   catch (error) {
      console.log(error);
   }
}

async function saveStudent(req, resp) {
   let id = await getLastStudentId() + 1;
   const { first_name, last_name, major } = req.body;
   let student = Student({ _id: id, first_name: first_name, last_name: last_name, major: major });
   await student.save();
   resp.status(201).send('student saved');
}

async function updateStudent(req, resp) {
   const id = Number(req.params.id);
   const { first_name, last_name, major } = req.body;
   try {
      const student = await Student.findByIdAndUpdate(id,
         { first_name: first_name, last_name: last_name, major: major },
         { new: true });
      resp.status(200).send('student updated');
   }
   catch (error) {
      throw error;
   }
}

async function getStudentById(req, resp) {
   const id = Number(req.params.id);
   try {
      const student = await Student.findById(id);
      if (student) {
         resp.status(200).json(student);
      }
   }
   catch (error) {
      throw error;
   }
}

module.exports = { getStudents, saveStudent, updateStudent, getStudentById };

The new lines are 63-74 and 76. Line 76 just updates the exports to include the getStudentById() function.

Lines 63-74 define the getStudentById() function. Line 64 gets the id from the end of the URL. Lines 65-70 define a try block that will get a student based on the student’s id. Line 66 calls the findById() function to return a student with the passed id. If a student is returned, then line 68 will set the status to 200 and send a JSON representation of that student object. If there are any errors, line 72 inside the catch block will throw the error.
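Note that if no student has the requested id, findById() resolves to null, the if test fails, and the request never gets a response. A hedged sketch of a variant that answers that case as well:

// Variant of getStudentById() that responds when no student matches.
async function getStudentById(req, resp) {
   const id = Number(req.params.id);
   try {
      const student = await Student.findById(id);
      if (student) {
         resp.status(200).json(student);
      }
      else {
         resp.status(404).send('student not found');
      }
   }
   catch (error) {
      resp.status(500).send('error fetching student');
   }
}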

Next, we need to edit the index.js file. Here is the new version of index.js

/path/to/mongo_db/index.js
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const db = require('./queries.js');

app.use(bodyParser.json());
app.use(
   bodyParser.urlencoded({
      extended: true
   })
);
app.get('/', function(req, resp) {
   resp.send('hello from MongoDB index.js\n');
});
app.get('/students', db.getStudents);
app.post('/students', db.saveStudent);
app.put('/students/:id', db.updateStudent);
app.get('/students/:id', db.getStudentById);

app.listen(3000, () => {
   console.log('app listening on port 3000');
});

The new line is 18. Line 18 sets the route to be a GET request with a URL like http://localhost:3000/students/3 where we want to get the student with an _id = 3.

Since this is a GET request, this can be tested using curl. Be sure to stop and restart the express server. Then, run the curl command in another terminal.

$ node index.js
app listening on port 3000
^C

$ node index.js
app listening on port 3000

Then, in another terminal run the following:

$ curl http://localhost:3000/students/3
{"_id":3,"first_name":"Bob","last_name":"Simmons","major":"Physics"}

As you can see, this new route works.

Adding a route to delete a student

To complete our set of CRUD operations, we need to add a Delete capability. Here is the new version of queries.js that does this:

/path/to/mongo_db/queries.js
const mongoose = require('mongoose');

async function connect() {
  try {
    await mongoose.connect("mongodb://172.17.0.2:27017/joe_db");
  }
  catch (error) {
    console.log(error);
    process.exit();
  }
}
connect();

const studentSchema = mongoose.Schema({
   _id: Number,
   first_name: String,
   last_name: String,
   major: String
});
const Student = mongoose.model('Student',studentSchema);

function getStudents(req, resp) {
   Student.find().
      then(students => {
         resp.status(200).json(students);
   });
}

async function getLastStudentId() {
   let id;
   try {
      const students = await Student.find().sort('_id');
      const id = students[students.length-1]['_id'];
      return Number(id);
   }
   catch (error) {
      console.log(error);
   }
}

async function saveStudent(req, resp) {
   let id = await getLastStudentId() + 1;
   const { first_name, last_name, major } = req.body;
   let student = Student({ _id: id, first_name: first_name, last_name: last_name, major: major });
   await student.save();
   resp.status(201).send('student saved');
}

async function updateStudent(req, resp) {
   const id = Number(req.params.id);
   const { first_name, last_name, major } = req.body;
   try {
      const student = await Student.findByIdAndUpdate(id,
         { first_name: first_name, last_name: last_name, major: major },
         { new: true });
      resp.status(200).send('student updated');
   }
   catch (error) {
      throw error;
   }
}

async function getStudentById(req, resp) {
   const id = Number(req.params.id);
   try {
      const student = await Student.findById(id);
      if (student) {
         resp.status(200).json(student);
      }
   }
   catch (error) {
      throw error;
   }
}

async function deleteStudent(req, resp) {
   const id = Number(req.params.id);
   try {
      const student = await Student.findByIdAndDelete(id);
      if (student) {
         resp.status(200).send('student deleted');
      }
   }
   catch (error) {
      throw error;
   }
}

module.exports = { getStudents, saveStudent, updateStudent, getStudentById,
  deleteStudent };

The new lines are 76-87 and 89-90. Lines 89-90 just update the exports to include the deleteStudent() function.

Lines 76-87 define the deleteStudent() function. Line 77 gets the id from the end of the URL. Lines 78-83 define a try block that calls the findByIdAndDelete() function to delete the student with the id that is passed. If this succeeds, line 81 sets the status to 200 and sends a message of 'student deleted' in the response.

To make use of these changes, the index.js file is modified in the following way:

/path/to/mongo_db/index.js
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const db = require('./queries.js');

app.use(bodyParser.json());
app.use(
   bodyParser.urlencoded({
      extended: true
   })
);
app.get('/', function(req, resp) {
   resp.send('hello from MongoDB index.js\n');
});
app.get('/students', db.getStudents);
app.post('/students', db.saveStudent);
app.put('/students/:id', db.updateStudent);
app.get('/students/:id', db.getStudentById);
app.delete('/students/:id', db.deleteStudent);

app.listen(3000, () => {
   console.log('app listening on port 3000');
});

The new line is 19. Line 19 creates a route that uses the DELETE verb and specifies a URL of http://localhost:3000/students/3 to delete a student with an _id = 3. For the route to work, the deleteStudent() function defined in queries.js is called.

Here I use curl to see the list of all the students:

$ curl http://localhost:3000/students
[{"_id":1,"first_name":"Jane","last_name":"Doe","major":"Biology"},{"_id":2,"first_name":"John","last_name":"Doe","major":"Chemistry"},{"_id":3,"first_name":"Bob","last_name":"Simmons","major":"Physics"}]

Next, I use Postman to test the DELETE route:

delete student

Then, I used curl to verify that the student with an id of 3 is gone.

$ curl http://localhost:3000/students
[{"_id":1,"first_name":"Jane","last_name":"Doe","major":"Biology"},{"_id":2,"first_name":"John","last_name":"Doe","major":"Chemistry"}]

So, the DELETE route works.

Creating the front end web application

One thing that I have not yet taken into account is the CORS problem when running the web application in a browser. The CORS policy only comes into effect when we use the web front end. So, before creating the web application, I need to modify the index.js file so that the browser does not block responses from the express server because of the CORS policy. Here is the modified version of index.js. Note that it takes into account that we will be serving out the web application using http-server, with localhost as the host and 8000 as the port.

/path/to/mongo_db/index.js
const express = require('express');
const bodyParser = require('body-parser');
const cors = require('cors');
const app = express();
const db = require('./queries.js');

app.use(cors({
   origin: 'http://localhost:8000'
}));
app.use(bodyParser.json());
app.use(
   bodyParser.urlencoded({
      extended: true
   })
);
app.get('/', function(req, resp) {
   resp.send('hello from MongoDB index.js\n');
});
app.get('/students', db.getStudents);
app.post('/students', db.saveStudent);
app.put('/students/:id', db.updateStudent);
app.get('/students/:id', db.getStudentById);
app.delete('/students/:id', db.deleteStudent);

app.listen(3000, () => {
   console.log('app listening on port 3000');
});

The new lines are 3 and 7-9. Line 3 makes the cors module available. Lines 7-9 make it so that the express server will only allow requests coming from http://localhost:8000. That takes into account that we will serve out the web application using the following command:

$ cd /path/to/mongo_db
$ http-server -a localhost -p 8000
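The cors() middleware also accepts an array of origins and a list of allowed methods, which is handy if the front end is ever served from more than one place. A hedged sketch of that configuration (the second origin is just an example):

// cors configuration allowing two origins and the verbs this API uses.
app.use(cors({
   origin: ['http://localhost:8000', 'http://127.0.0.1:8000'],
   methods: ['GET', 'POST', 'PUT', 'DELETE']
}));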

With that change to index.js, I copied students_app.html from webapp_with_postgresql_docker to the /path/to/mongo_db folder and checked that the proper routes are being used. Fortunately, the route URLs are the same as in the index.js file for that application. It turns out that only two changes are needed: the ID key for the MongoDB database is named _id, not id, and in getStudentById() the mongoose function returns a single student, not an array of students (as pg's function did). So, there are a few places where the web application had to be modified, and those are shown next:

/path/to/mongo_db/students_app.html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <script src="https://unpkg.com/axios/dist/axios.min.js"></script>
    <script>
      document.addEventListener('DOMContentLoaded', init);
      let students_data = [];
      let students_tbody;
      let add_dlg;
      let edit_dlg;
      let student_id;

      function removeChildren(elem) {
        while (elem.childNodes.length > 0) {
          elem.removeChild(elem.childNodes[0]);
        }
      }

      async function getStudents() {
        try {
          const students = await axios({
            method: 'get',
            url: 'http://localhost:3000/students'
          });
          students_data = students.data;
          console.log(students_data);
          updateStudentsTable();
        }
        catch (error) {
          console.log(error);
        }
      }

      async function handleAdd() {
        const add_fn_box = document.getElementById('add_fn_box');
        let first_name = add_fn_box.value.trim();
        const add_ln_box = document.getElementById('add_ln_box');
        let last_name = add_ln_box.value.trim();
        const add_maj_box = document.getElementById('add_maj_box');
        let major = add_maj_box.value.trim();
        try {
          await axios({
            method: 'post',
            url: 'http://localhost:3000/students',
            data: {
              first_name: first_name,
              last_name: last_name,
              major: major
            }
          });
          getStudents();
          add_dlg.close();
        }
        catch (error) {
        }
      }

      async function showEditDlg(event) {
        const id = Number(event.currentTarget.id);
        student_id = id;
        try {
          const results = await axios({
             method: 'get',
             url: `http://localhost:3000/students/${id}`,
          });
          console.log('results', results);
          //const student = results.data[0];
          const student = results.data;
          const edit_fn_box = document.getElementById('edit_fn_box');
          edit_fn_box.value = student.first_name;
          const edit_ln_box = document.getElementById('edit_ln_box');
          edit_ln_box.value = student.last_name;
          const edit_maj_box = document.getElementById('edit_maj_box');
          edit_maj_box.value = student.major;
          edit_dlg.showModal();
        }
        catch (error) {
          console.log(error);
        }
      }

      async function handleEdit() {
        const edit_fn_box = document.getElementById('edit_fn_box');
        let first_name = edit_fn_box.value.trim();
        const edit_ln_box = document.getElementById('edit_ln_box');
        let last_name = edit_ln_box.value.trim();
        const edit_maj_box = document.getElementById('edit_maj_box');
        let major = edit_maj_box.value.trim();
        await axios({
          method: 'put',
          url: `http://localhost:3000/students/${student_id}`,
          data: {
            first_name: first_name,
            last_name: last_name,
            major: major
          }
        });
        getStudents();
        edit_dlg.close();
      }

      async function handleDelete() {
        await axios({
          method: 'delete',
          url: `http://localhost:3000/students/${student_id}`
        });
        getStudents();
        edit_dlg.close();
      }

      function updateStudentsTable() {
        removeChildren(students_tbody);
        for (let i = 0; i < students_data.length; i++) {
          const student = students_data[i];
          let tr = document.createElement('tr');
          tr.addEventListener('click', showEditDlg);
          //const id = student.id;
          const id = student._id;
          tr.setAttribute("id", id);
          let td = document.createElement('td');
          let content = document.createTextNode(student.first_name);
          td.appendChild(content);
          tr.appendChild(td);
          td = document.createElement('td');
          content = document.createTextNode(student.last_name);
          td.appendChild(content);
          tr.appendChild(td);
          td = document.createElement('td');
          content = document.createTextNode(student.major);
          td.appendChild(content);
          tr.appendChild(td);
          students_tbody.appendChild(tr);
        }
      }

      function init() {
        students_tbody = document.getElementById('students_tbody');
        add_dlg = document.getElementById('add_dlg');
        const add_button = document.getElementById('add_button');
        add_button.addEventListener('click', () => { add_dlg.showModal(); });
        const add_cancel = document.getElementById('add_cancel');
        add_cancel.addEventListener('click', () => { add_dlg.close(); });
        const add_ok = document.getElementById('add_ok');
        add_ok.addEventListener('click', handleAdd);
        edit_dlg = document.getElementById('edit_dlg');
        const edit_cancel = document.getElementById('edit_cancel');
        edit_cancel.addEventListener('click', () => { edit_dlg.close(); });
        const edit_ok = document.getElementById('edit_ok');
        edit_ok.addEventListener('click', handleEdit);
        const delete_button = document.getElementById('delete_button');
        delete_button.addEventListener('click', handleDelete);
        getStudents();
      }
    </script>
  </head>
  <body>
    <h1>Students Table</h1>
    <button id="add_button">Add student</button><br>
    <dialog id="add_dlg">
      First Name:
      <input type="text" id="add_fn_box"><br>
      Last Name:
      <input type="text" id="add_ln_box"><br>
      Major:
      <input type="text" id="add_maj_box"><br>
      <br>
      <button id="add_cancel">Cancel</button>
      <button id="add_ok">Ok</button>
    </dialog>
    <dialog id="edit_dlg">
      First Name:
      <input type="text" id="edit_fn_box"><br>
      Last Name:
      <input type="text" id="edit_ln_box"><br>
      Major:
      <input type="text" id="edit_maj_box"><br>
      <br>
      <button id="edit_cancel">Cancel</button>
      <button id="edit_ok">Ok</button>
      <button id="delete_button">Delete student</button>
    </dialog>
    <table border="1">
      <thead>
        <tr>
          <th>First Name</th>
          <th>Last Name</th>
          <th>Major</th>
        </tr>
      </thead>
      <tbody id="students_tbody"></tbody>
    </table>
  </body>
</html>

The new lines are 67-69 and 118-119. Line 67 prints out results. If you look in the console, you will see this:

results data

Note that data is a single object, not an array of objects as in the PostgreSQL example using the pg driver. Line 68 is commented out so you can see what the old code used to be, and line 69 shows the corrected code.

Line 118 shows the commented out old code that used student.id. Line 119 shows the corrected code which uses student._id.

Running the web application

To review the process of running the front end web application, here are the steps to follow:

If the express server is already running, stop it by typing CTRL+C, then restart it with node index.js:

/path/to/mongo_db$ node index.js
app listening on port 3000
^C

/path/to/mongo_db$ node index.js
app listening on port 3000

Then in another terminal, use http-server to serve out the web application.

$ cd /path/to/mongo_db
$ http-server -a localhost -p 8000
Starting up http-server, serving ./

http-server version: 14.1.1

http-server settings:
CORS: disabled
Cache: 3600 seconds
Connection Timeout: 120 seconds
Directory Listings: visible
AutoIndex: visible
Serve GZIP Files: false
Serve Brotli Files: false
Default File Extension: none

Available on:
  http://localhost:8000
Hit CTRL-C to stop the server

Then, open a browser to http://localhost:8000/students_app.html. You will be able to run the application: you can add a student, edit a student by clicking on it, and delete a student by clicking on it and choosing Delete student in the edit dialog box.