stefanpantucu authored
3366a318

faas-doc

Azure Functions

  • Serverless compute for Azure.
  • Offers the possibility to run functions written in C#, Java, and Python, to name a few
  • For languages that are not supported, the user must provide a custom handler
  • Functions can be written directly in the Azure Portal
  • Functions can be developed and tested locally and then deployed to a function app. Microsoft provides VS Code plugins to help with local development and ease deployment.

Development can also be done in other editors via the CLI. Azure Functions Core Tools provides the core runtime and templates that enable local development. Microsoft provides an installation guide.

Create local project

To create a project, use the following command:

func init MyProjFolder --worker-runtime dotnet-isolated

To use other languages, specify another name for --worker-runtime, such as node or python. The --language flag is supported when --worker-runtime is set to node, and the options are typescript and javascript.

Create a function

To add a function to your project, run the func new command. The --template flag is used to select the trigger template.

e.g.

func new --template "Http Trigger" --name MyHttpTrigger

func templates list provides a list of available templates for the language of your choice.

Local testing

First things first, the Functions host must be started from the root of the project using func start. For HTTP-triggered functions, the URLs will be printed:

Http Function MyHttpTrigger: http://localhost:7071/api/MyHttpTrigger

Using this URL one can test the function before deploying it.
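As a sketch of how one might smoke-test that URL from a script (using a throwaway stand-in server here, since the real host listens on localhost:7071 only after func start), something like this works:

```python
import http.server
import threading
import urllib.request

# Stand-in for the local Functions host; the real one listens on
# http://localhost:7071 after `func start`.
class StubHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Hello from MyHttpTrigger")

    def log_message(self, *args):  # keep the output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# In real use, the URL would be http://localhost:7071/api/MyHttpTrigger
with urllib.request.urlopen(f"http://127.0.0.1:{port}/api/MyHttpTrigger") as resp:
    body = resp.read().decode()
print(body)  # Hello from MyHttpTrigger
server.shutdown()
```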

One thing to keep in mind is that no authentication is required when testing locally. After publishing the function, the URL used for testing is available in the Azure Portal and contains a query parameter named code used for authentication (this is my assumption: the URL given when deploying the function does not contain any query parameters and a plain GET request returns status code 401, but the request works with the code parameter from the portal).
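As a sketch of the resulting URL shape (the app name myfuncapp and the key value below are placeholders; the real key comes from the Azure Portal), the code parameter can be appended like this:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def with_function_key(url, key):
    """Append the 'code' query parameter that carries the function key."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    extra = urlencode({"code": key})
    query = f"{query}&{extra}" if query else extra
    return urlunsplit((scheme, netloc, path, query, fragment))

# Placeholder app name and key value, for illustration only.
print(with_function_key(
    "https://myfuncapp.azurewebsites.net/api/MyHttpTrigger", "abc123"))
# https://myfuncapp.azurewebsites.net/api/MyHttpTrigger?code=abc123
```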

Deployment

There are three types of deployments:

  • project files
  • Azure Container Apps
  • Kubernetes cluster

So far I only tested the project files deployment. It can be done using the following command:

func azure functionapp publish <FunctionAppName>, where FunctionAppName is the name of the function app created in the Azure Portal.

Note: Before publishing the app, one has to be logged in to Azure. I did this via the Azure CLI. Follow the instructions provided by Microsoft to install it for your OS.

Fun fact: I first created a dotnet project and uploaded it to the function app on Azure. Then I created a second one in node, and when trying to publish it I got the following error:

Your Azure Function App has 'FUNCTIONS_WORKER_RUNTIME' set to 'dotnetIsolated' while your local project is set to 'node'.
You can pass --force to update your Azure app with 'node' as a 'FUNCTIONS_WORKER_RUNTIME'

So I ended up using --force.

AWS Step Functions

Orchestrator for AWS Lambda functions, connecting them into serverless workflows called state machines.

I followed the guide from here to create a mock e-commerce system that checks the inventory when triggered by a message received in a queue.

Steps:

  • Created an Amazon SQS queue
  • Designed a workflow in Step Functions that describes how orders are to be processed
  • Created a Lambda function that mocks an inventory microservice
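The inventory Lambda receives the SQS message containing the task token sent by the workflow. A minimal sketch of the parsing step (the event below is a hand-made stand-in for a real SQS event; a real inventory function would then report the result back with boto3 via send_task_success):

```python
import json

def extract_task_token(record):
    """Pull the Step Functions task token out of one SQS record.

    A real inventory Lambda would check stock and then call, e.g.,
    stepfunctions.send_task_success(taskToken=token, output=...) via boto3.
    """
    body = json.loads(record["body"])
    return body["TaskToken"]

# Hand-made stand-in for the SQS event the Lambda receives:
event = {"Records": [{"body": json.dumps({
    "MessageTitle": "Callback Task started by Step Functions",
    "TaskToken": "example-token",
})}]}

print(extract_task_token(event["Records"][0]))  # example-token
```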

The Step Functions workflow can be created visually, in the console's workflow designer, or using ASL (Amazon States Language), which is a JSON-based structured language used to define a state machine. The definition for the workflow described above looks like this:

{
  "Comment": "An example of the Amazon States Language for starting a callback task.",
  "StartAt": "Check Inventory",
  "States": {
    "Check Inventory": {
      "Type": "Task",
      "Resource": "arn:aws:states:::sqs:sendMessage.waitForTaskToken",
      "Parameters": {
        "QueueUrl": "<INSERT SQS QUEUE URL HERE>",
        "MessageBody": {
          "MessageTitle": "Callback Task started by Step Functions",
          "TaskToken.$": "$$.Task.Token"
        }
      },
      "Next": "Notify Success",
      "Catch": [
      {
        "ErrorEquals": [ "States.ALL" ],
        "Next": "Notify Failure"
      }
      ]
    },
    "Notify Success": {
      "Type": "Pass",
      "Result": "Callback Task started by Step Functions succeeded",
      "End": true
    },
    "Notify Failure": {
      "Type": "Pass",
      "Result": "Callback Task started by Step Functions failed",
      "End": true
    }
  }
}

Writing the JSON generates the visual representation and vice versa.
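Since the definition is plain JSON, small sanity checks can be scripted before deploying. A sketch that verifies every StartAt and Next target actually exists in States, run here against a trimmed-down copy of the definition above:

```python
# Trimmed-down copy of the definition above (Parameters etc. omitted).
definition = {
    "StartAt": "Check Inventory",
    "States": {
        "Check Inventory": {
            "Type": "Task",
            "Next": "Notify Success",
            "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "Notify Failure"}],
        },
        "Notify Success": {"Type": "Pass", "End": True},
        "Notify Failure": {"Type": "Pass", "End": True},
    },
}

def undefined_targets(defn):
    """Return transition targets that are referenced but not defined."""
    states = defn["States"]
    targets = [defn["StartAt"]]
    for state in states.values():
        if "Next" in state:
            targets.append(state["Next"])
        for catcher in state.get("Catch", []):
            targets.append(catcher["Next"])
    return [t for t in targets if t not in states]

print(undefined_targets(definition))  # []
```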

Create local project

First, we need to install the AWS SAM CLI, where SAM stands for Serverless Application Model. Follow the guide for your OS.

To create a project, use the following commands:

mkdir your-project
cd your-project
sam init

The user will be prompted with these choices:

1.    Template: AWS Quick Start Templates

2.    Language: Python, Ruby, NodeJS, Go, Java, or .NET

3.    Project name: (name of your choice - default is sam-app)

4.    Quick start application: Multi-step workflow

The command creates a directory with the name of your choice. There are two especially interesting files that you can take a look at:

  • template.yaml: Contains the AWS SAM template that defines your application's AWS resources.

  • statemachine/stockTrader.asl.json: Contains the application's state machine definition, which is written in Amazon States Language (ASL).

To build the application, use:

sam build

Local testing

After the app is built, the function can be tested with this command:

sam local invoke <FunctionName> -e events/events.json

The project scaffold includes an event file that can be passed as an argument to act as the body of the request.
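A custom event file can also be created by hand. A minimal sketch (the payload shape below is hypothetical; the real shape depends on what the function's trigger expects):

```python
import json
import os
import tempfile

# Hypothetical payload; the real shape depends on the function's trigger.
event = {"body": json.dumps({"stock_price": 42})}

path = os.path.join(tempfile.mkdtemp(), "event.json")
with open(path, "w") as f:
    json.dump(event, f, indent=2)

# `sam local invoke <FunctionName> -e <path>` would pass this file's
# contents to the function as the Lambda event.
with open(path) as f:
    loaded = json.load(f)
print(loaded == event)  # True
```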

Deployment

Use this command when deploying for the first time:

sam deploy --guided

For subsequent deployments, the --guided flag is not needed anymore.

OpenFaaS

Framework for building serverless functions on top of containers.

Create local project

First, we have to install faas-cli, the CLI for OpenFaaS. I used this command, but there is a guide on their GitHub repo for other operating systems:

$ curl -sSL https://cli.openfaas.com | sudo sh

I followed the demo on their homepage and used one of their function templates from the store:

faas-cli template store pull python3-http

Now that the template is available on my machine, I can create my function based on it with the following command:

faas-cli new --lang python3-http --prefix stefanpantucu py1

This will generate the following files:

├── py1
│   ├── handler.py
│   ├── handler_test.py
│   ├── requirements.txt
│   └── tox.ini
├── py1.yml

handler.py is our function and py1.yml is the stack file, which holds information such as the gateway where the function is to be deployed, the image the function will be built into, and so on. Here is how it looks:

version: 1.0
provider:
  name: openfaas
  gateway: http://127.0.0.1:8080
functions:
  py1:
    lang: python3-http
    handler: ./py1
    image: stefanpantucu/py1:latest

The latest tag should be replaced in order to keep track of the versions of the function.
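For reference, the handler generated by the python3-http template looks roughly like this (the body below is a sketch; the template passes an event carrying the HTTP method, headers, and body, plus a context, and turns the returned dict into the HTTP response):

```python
# Sketch of py1/handler.py as generated by the python3-http template.
# The event object exposes the HTTP request; the returned dict
# becomes the HTTP response.
def handle(event, context):
    return {
        "statusCode": 200,
        "body": "Hello from py1",
    }

print(handle(None, None)["body"])  # Hello from py1
```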

Local testing

First, we need to build the image of our function:

faas-cli build -f py1.yml

Then, we run the function locally:

faas-cli local-run --image build/py1/Dockerfile -f py1.yml

This will print something like the following, besides the Docker image build info:

Image: stefanpantucu/py1:0.1.4
Starting local-run for: py1 on: http://0.0.0.0:8080

Running curl http://0.0.0.0:8080 will call the function and return the response.

Deployment

There is an easy way, which is using:

faas-cli up -g $URL -f py1.yml

where $URL is the URL of your OpenFaaS instance (-g specifies the gateway).

OR

faas-cli build
faas-cli push
faas-cli deploy

The first approach (faas-cli up) does all three at the same time.

Here is a look at the OpenFaaS instance I deployed and tested on. The functions can also be tested from its UI.

If you pushed your function's container image to a registry, you can also deploy the function from the UI: I used Docker Hub to push my images and specified the name of my image and the tag.

Requirements

OpenFaaS requires you to have an instance of it running somewhere, whether locally or on the web. I believe they can also host it for you, but of course that is not free.

To test OpenFaaS, I used Azure to create a Kubernetes cluster where I could run OpenFaaS and then deploy my functions there. I am no k8s expert, but thankfully I found this guide on how to use OpenFaaS on Azure Kubernetes Service (AKS) and was able to do my tests.