.Net Core – OpenFaas – MongoDB

After playing around with some OpenFaas functions I came across the alexellis/mongodb-function. After reading through the README and having a play around with it I thought I would have a go at porting it to C# using .Net Core. Not only did I do this for a personal challenge but I also thought it could be useful to others who prefer to develop using C#/.Net Core.

This function creates a connection to a MongoDB instance and maintains that connection for the lifetime of the function. This means that the cold start, the time taken to create the connection, only occurs on the initial request, so all subsequent requests are relatively fast.
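The reuse pattern is easiest to see with a counter. Below is a minimal Go sketch (the template itself is C#), where a sync.Once stands in for the expensive MongoDB connection setup:

```go
package main

import (
	"fmt"
	"sync"
)

var (
	once      sync.Once
	dialCount int // how many times the "connection" has been created
)

// getClient stands in for fetching the MongoDB client. The expensive setup
// inside once.Do runs only on the first request — the cold start — and every
// later invocation reuses the result.
func getClient() int {
	once.Do(func() {
		dialCount++ // the real template would dial MongoDB here
	})
	return dialCount
}

func main() {
	for i := 1; i <= 3; i++ {
		fmt.Printf("request %d: connections created: %d\n", i, getClient())
	}
}
```

However many requests arrive, the setup block runs exactly once.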

The csharp-kestrel-mongo project is the result. When created, this OpenFaas function template provides you with a function handler through which you can interact with a MongoDB. The template lets you specify environment variables that define your MongoDB instance(s).

It follows a slightly different design architecture compared to the one described in the original mongodb-function.

Credit: Alex Ellis’ original template architecture https://github.com/alexellis/mongodb-function modified for this template

If you are familiar with OpenFaas, what follows is a quick rundown on how to get up and running with this function using Play with Docker (a free online Docker playground).

If you are using Play with Docker, create a Docker instance and install OpenFaas following the Docker Swarm instructions in the OpenFaas docs; otherwise, you can follow along with your own OpenFaas deployment. You will also need access to a MongoDB. If you are following along using Play with Docker, this can be done by running the following Docker commands in your instance:

docker volume create mongodata

docker run --name mongodb -d -p 27017:27017 -v mongodata:/data/db mongo mongod

Once that has completed, the MongoDB function template can be pulled and a new function created:

faas-cli template pull https://github.com/Marcus-Smallman/csharp-kestrel-mongo.git

faas-cli new mongo-function --lang csharp-kestrel-mongo

Note: armhf is also supported, so you can deploy this function to your Raspberry Pi(s)! Simply change the specified language to csharp-kestrel-mongo-armhf.

This will pull the template and create a new function called ‘mongo-function’. A mongo-function.yml file will be created which is required to deploy your function. This file allows for environment variables to be set that your function can access. An environment variable called mongo_endpoint is required for this function which specifies where your MongoDB is accessible from. There are also some extra optional environment variables that can be set.

An example of the modified mongo-function.yml file follows:

provider:
  name: faas
  gateway: http://<your-openfaas-gateway>:8080

functions:
  mongo-function:
    lang: csharp-kestrel-mongo
    handler: ./mongo-function
    image: <docker-registry>/mongo-function
    environment:
      mongo_endpoint: <your-mongo-endpoint>:27017
      mongo_database_name: my_mongo_database
      mongo_collection_name: my_mongo_collection

Note: Don’t forget to modify your gateway address! I always forget to do this.

If you are following along with Play with Docker, replace <your-mongo-endpoint> with the IP of the Docker instance you created and replace the gateway value with your OpenFaas endpoint (the URL). As you will also be building and pushing this function to a Docker registry, make sure that <docker-registry> is replaced with the one you will use for this function.

Now that we have everything set up we can get to the fun part, the actual function logic! This will all be done in the generated FunctionHandler.cs file:

public class FunctionHandler
{
    public Task<string> Handle(object input)
    {
        var response = new ResponseModel()
        {
            response = input,
            status = 201
        };

        return Task.FromResult(JsonConvert.SerializeObject(response));
    }
}

By default the template inserts the request body into MongoDB and returns a response object containing the provided input and an HTTP status code of 201. This can of course be changed to do anything you want, from creating a new MongoDB collection to updating all documents that contain the value OpenFaas. The function can then be built, pushed and deployed:

faas-cli build -f mongo-function.yml

faas-cli push -f mongo-function.yml

faas-cli deploy -f mongo-function.yml

To access and play around with the function we can head over to the OpenFaas gateway UI:

Data can then be sent to the function via the request body, as seen in the screenshot above. If successful, this will return your given input and a status code of 201.
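The shape of that response can be sketched outside the function. The handler above is C#; this Go sketch mirrors what ResponseModel serialises to, with the input echoed back alongside the status code (the sample input is hypothetical):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// responseModel mirrors the ResponseModel the C# handler serialises:
// the original input echoed back plus an HTTP-style status code.
type responseModel struct {
	Response interface{} `json:"response"`
	Status   int         `json:"status"`
}

// handle sketches the default behaviour: echo the input with status 201.
// (The real template also inserts the body into MongoDB first.)
func handle(input interface{}) (string, error) {
	out, err := json.Marshal(responseModel{Response: input, Status: 201})
	return string(out), err
}

func main() {
	body, _ := handle(map[string]string{"message": "hello OpenFaas"})
	fmt.Println(body)
	// → {"response":{"message":"hello OpenFaas"},"status":201}
}
```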

We can also prove that the data has been written to our MongoDB by using a MongoDB client to connect to our MongoDB and check that the data has been stored:

Cool, huh?

For more technical and in-depth usage of the function template, please read the documentation in the project repository.

GitHub project: csharp-kestrel-mongo

– Marcus

Writing OpenFaas Serverless Functions in Go


In this post I will take you through the process of writing a Serverless function in Go with OpenFaas.

Note: Knowledge of Linux and Go will be very helpful.

Getting Started

Before we get to writing Serverless functions in Go, you need to have an OpenFaas deployment. If you do, great, you can skip to the next section. If you don’t, then you can follow Alex’s (founder of OpenFaas) tutorials on how to get a deployment up and running. He has a lot of great tutorials and posts that have heavily influenced the creation of this post. You should also check out his post on creating Go Serverless functions as well, as he goes into much more depth.

Writing a Serverless function

Note: I will be developing the following function on a Raspberry Pi. This should not matter, though, as long as you can run the OpenFaas CLI and build and deploy Docker containers.

The function we are going to create will simply return the current time on the machine that executes it.

First create the function project.

faas-cli new --lang=go-armhf get-time
Note: As I am running Go on an ARM architecture, the suffix '-armhf' is added to the '--lang' parameter. This is not required if you are developing and building Go on x86.

The command above should have created three separate items:

  • A template folder, which holds all the templates required to build and run Serverless functions in your chosen language on OpenFaas.
  • A get-time.yml file that holds the configuration of your function: for example, the name of your function’s Docker image, where the function will be deployed and the location of your function handler (the actual code of your function).
  • A get-time folder that will hold your function code.

If you have those three, we can open up the handler.go file in the get-time folder. You should see the following:

package function

import (
	"fmt"
)

// Handle a serverless request
func Handle(req []byte) string {
	return fmt.Sprintf("Hello, Go. You said: %s", string(req))
}

First we want to import the time package so that we can get the current time.

import (
	"fmt"
	"time"
)

Now we can change the return message with the current time.

return fmt.Sprintf("The current time on this machine is %s", time.Now())

Before we can build, push and deploy our changes, we need to make sure that our get-time.yml file has the correct configuration. The following is an example of my get-time.yml file, with the changes I made to the generated defaults.

provider:
  name: faas

functions:
  get-time:
    lang: go-armhf
    handler: ./get-time
    image: marcussmallman/get-time

Next, we can build this function and deploy it to OpenFaas.

faas-cli build -f get-time.yml

faas-cli push -f get-time.yml

faas-cli deploy -f get-time.yml

If the three commands above were successful, we can head over to the OpenFaas UI and expect to see the get-time function there.

Now we can press the Invoke button which will call our function that we just created and hopefully return the current time.

And there we have it.

To End

As simple as this post was to follow, I think it is a great example of how powerful and practical Serverless functions can be. The fact that it is the function itself that scales depending on load is just awesome and potentially very beneficial. That’s not to say it’s the answer to every problem, but it is definitely an exciting technology that we should all be aware of.

– Marcus