Accessing a MongoDB Atlas Cluster from within Google Cloud Functions Console

Logan May · Feb 19, 2018 · Viewed 7.8k times

I'm writing a basic Google Cloud Function that is going to query a MongoDB Cluster from MongoDB Atlas. I'm writing inside the Google Console and I was sure to add "mongodb": "^3.0.2" to the dependencies in the package.json file.
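For reference, the dependencies block I added looks like this (the name and version fields are just whatever the console generated, shown here as placeholders):

{
  "name": "my-function",
  "version": "0.0.1",
  "dependencies": {
    "mongodb": "^3.0.2"
  }
}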

Here's the function (I replaced the valid password, etc. in the uri for security):

/**
 * Responds to any HTTP request that can provide a "message" field in the body.
 *
 * @param {!Object} req Cloud Function request context.
 * @param {!Object} res Cloud Function response context.
 */
exports.myPackage = (req, res) => {    
  var MongoClient = require('mongodb').MongoClient;
  var uri = "mongodb+srv://<USERNAME>:<PASSWORD>@<CLUSTER-NAME>-vtbhs.mongodb.net/test";
  MongoClient.connect(uri, function(err, client) {
    if (err) {
      console.log('err');
      console.log(err);
    } else {
      const collection = client.db("test").collection("devices");
    }
  });

  res.status(200).send('Success');
}

I'm sure the driver is up to date, and I copied most of this code directly from the Atlas docs. I've also whitelisted all IP addresses (0.0.0.0/0) in Atlas for testing.

Every time the function runs, I get the following error in the connect callback:

"{ Error: querySrv ESERVFAIL _mongodb._tcp.<CLUSTER-NAME>-vtbhs.mongodb.net
at errnoException (dns.js:28:10)
at QueryReqWrap.onresolve [as oncomplete] (dns.js:219:19)
code: 'ESERVFAIL',
errno: 'ESERVFAIL',
syscall: 'querySrv',
hostname: '_mongodb._tcp.<CLUSTER-NAME>-vtbhs.mongodb.net' }"

I also previously got an error like:

URI does not have hostname, domain name and tld at module.exports

That one doesn't seem to pop up anymore since I changed my password in Atlas (the old one may have contained a character that wasn't URL-encoded).
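If it comes back, I assume the proper fix is to percent-encode the credentials before building the URI, something along these lines (placeholders again, not my real values):

// Percent-encode the credentials so reserved characters (@ : / etc.)
// don't break connection-string parsing.
var username = encodeURIComponent('<USERNAME>');
var password = encodeURIComponent('<PASSWORD>'); // e.g. 'p@ss/word' -> 'p%40ss%2Fword'
var uri = 'mongodb+srv://' + username + ':' + password +
          '@<CLUSTER-NAME>-vtbhs.mongodb.net/test';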

Thanks in advance for any help!

Answer

JP Lew · Mar 3, 2018

I had the exact same issue. It was failing for both MongoDB Atlas and mLab connections after running firebase deploy, but working locally using firebase serve.

I believe there are two problems:

  1. Outbound Networking is disabled on the free Spark plan. You have to upgrade to the Blaze Plan (pay-as-you-go) in order to do that. I just switched tiers and it works now.

The Spark plan allows outbound network requests only to Google-owned services. Inbound invocation requests are allowed within the quota. On the Blaze plan, Cloud Functions provides a perpetual free tier. The first 2,000,000 invocations, 400,000 GB-sec, 200,000 CPU-sec, and 5 GB of Internet egress traffic is provided for free each month. You are only charged on usage past this free allotment. Pricing is based on total number of invocations, and compute time. Compute time is variable based on the amount of memory and CPU provisioned for a function. Usage limits are also enforced through daily and 100s quotas. For more information, see Cloud Functions Pricing.

https://firebase.google.com/pricing/

You have to give a credit card, but as long as you remain under their data quota you don't get charged, apparently. I will try it out and see how it goes.

  2. After upgrading, your MongoDB Atlas connection might still fail (while an mLab connection string works). This is because your URI uses the MongoDB 3.6 SRV format (mongodb+srv://) rather than the 3.4 driver's standard "mongodb://" protocol. In the Atlas connect dialog, switch the driver version to 3.4, copy the standard connection string, and see if that works.
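For illustration, the 3.4-style (non-SRV) string looks roughly like this (the shard hostnames, replica set name, and credentials below are placeholders; copy the exact string from your own Atlas "Connect" dialog with driver version 3.4 selected):

var MongoClient = require('mongodb').MongoClient;

// Hypothetical 3.4-style standard connection string (placeholders only).
var uri = 'mongodb://<USERNAME>:<PASSWORD>@' +
  '<CLUSTER-NAME>-shard-00-00-vtbhs.mongodb.net:27017,' +
  '<CLUSTER-NAME>-shard-00-01-vtbhs.mongodb.net:27017,' +
  '<CLUSTER-NAME>-shard-00-02-vtbhs.mongodb.net:27017/test' +
  '?ssl=true&replicaSet=<CLUSTER-NAME>-shard-0&authSource=admin';

MongoClient.connect(uri, function(err, client) {
  // same connect callback as in the question
});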