I’ve been doing some work recently with api.ai and looking at ways of making more interesting chat bots using their fulfilment functionality, which allows you to call out to your own services to provide the processing for a user’s specific intent.
One way to deploy such a service is on Google Cloud using their Cloud Functions. These are tiny pieces of Javascript (node) code, each of which performs a single function. They are true microservices and Google provide the entire platform (Platform as a Service [PaaS]).
Google’s online tools do allow you to code directly on the Cloud Console, but it’s pretty awkward and the code has to be vanilla JS. So I wanted to find out how easy it would be to make a more maintainable cloud-function codebase using Typescript.
I found a couple of other blog posts on using Typescript with Cloud Functions, but they are generally related to responding to Firebase events (on the Cloud Pub/Sub channels) and I needed to fiddle around quite a bit to make any of them work. I also really wanted to use GitLab CI to automatically deploy the cloud function to Google on pushing the code.
So, I started with a fresh webpack project and added the usual Typescript addons (the Typescript compiler and ts-loader), plus the Babili Webpack Plugin, which minifies the bundled output.
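For reference, the dev dependencies this setup relies on can all be added with yarn (these are the published npm package names; I’m assuming you already have node and yarn installed):
yarn add --dev webpack typescript ts-loader babili-webpack-plugin webpack-node-externals @types/express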
My webpack config is pretty simple:
const BabiliPlugin = require('babili-webpack-plugin');
var nodeExternals = require('webpack-node-externals');

module.exports = {
  entry: "./src/index.tsx",
  output: {
    path: __dirname + "/dist",
    filename: "index.js",
    library: 'my-function-endpoint',
    libraryTarget: 'commonjs',
    publicPath: '/dist/'
  },
  resolve: {
    extensions: ['.js', '.ts', '.tsx'],
    modules: [__dirname + '/src', 'node_modules']
  },
  plugins: [
    new BabiliPlugin()
  ],
  devtool: 'source-map',
  module: {
    loaders: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        options: {
          transpileOnly: true
        }
      }
    ]
  },
  target: 'node',
  externals: [nodeExternals()]
};
The interesting parts of this are the import of webpack-node-externals (which you need to install with yarn as a dev dependency), setting the library name to the name of the cloud function endpoint, and the target: 'node' and externals settings at the bottom. Externals are modules that are left out of the bundle and assumed to be available at runtime - and they will be when the function runs on a node server.
The index.tsx file is a small file that wraps the class I’m using for processing in a function. One error I found when trying this all out is that Google Cloud Functions requires a function to be exported from the entry-point module.
import {Request, Response} from 'express';
import ApiAIRouter from "./actions/api-ai-router/ApiAIRouter";
export = ( req: Request, res: Response ) => new ApiAIRouter().process(req,res);
You can see we import the express types (added to the build with yarn add --dev @types/express) so that we can have fully typed request and response objects.
I won’t go into any depth about what the router does here, but it’s basically looking at the action field in the JSON request body, deciding what to do with it and then returning an appropriate response (a rough sketch follows below). Maybe I’ll go through an api.ai fulfilment example in another blog post.
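As a purely illustrative sketch (the class name matches mine, but the action name and responses are made up, and the real router does rather more), it might look something like this:
import {Request, Response} from 'express';

// Illustrative only: route on the api.ai (v1) webhook's result.action field
// and reply with a simple speech/displayText payload.
export default class ApiAIRouter {

    public process(req: Request, res: Response): void {
        const action: string = (req.body && req.body.result && req.body.result.action) || '';

        let speech: string;
        switch (action) {
            case 'say.hello': // hypothetical action name defined in api.ai
                speech = 'Hello from the cloud function!';
                break;
            default:
                speech = 'Sorry, I don\'t know the action "' + action + '".';
        }

        res.json({speech: speech, displayText: speech});
    }
}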
The tsconfig.json file looks like this:
{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es5",
    "noImplicitAny": false,
    "sourceMap": false,
    "lib": ["es6", "dom.iterable"],
    "outDir": "dist",
    "experimentalDecorators": true,
    "skipLibCheck": true,
    "removeComments": true,
    "jsx": "preserve"
  },
  "exclude": [
    "node_modules",
    "dist"
  ]
}
Again, the important bits are that the node_modules and dist directories are excluded and that we’re targeting es5 (vanilla JS). The rest is not so important; indeed, the JSX option only needs to be there if you intend to return HTML or use templating somewhere in the Typescript.
Running webpack in the project directory creates a single index.js file (and a source map) in the dist directory, which we can deploy to the cloud.
To do this, I wanted to use GitLab’s CI. I’ve already been through the CI setup in a previous blog post, so this yaml file should be pretty simple to understand:
stages:
  - build_js
  - deploy_to_cloud

build javascript:
  image: kkarczmarczyk/node-yarn
  stage: build_js
  before_script:
    - yarn install
  script:
    - node_modules/webpack/bin/webpack.js
    - cp package.json dist # Needed for runtime dependencies by the cloud function
  artifacts:
    paths:
      - dist
  only:
    - master

deploy to cloud:
  image: marcuswelz/gitlabci-docker-git-gcloud
  stage: deploy_to_cloud
  variables:
    GIT_STRATEGY: none
  before_script:
    # Ensure we have the cloud deploy functions
    - gcloud components install beta
    # Authenticate against the cloud
    - echo "$SERVICE_ACCOUNT_PRIVATE_KEY" > .tmp-keyfile.json
    - gcloud auth activate-service-account --key-file .tmp-keyfile.json
  script:
    # Deploy the application
    - cd dist
    - gcloud beta functions deploy my-function --entry-point=my-function-endpoint --stage-bucket=my-stage-bucket --trigger-http
  dependencies:
    - build javascript
The main parts to note are that there are two stages (build and deploy) which use different docker images. The first, kkarczmarczyk/node-yarn, builds the index.js file from the repository; the second, marcuswelz/gitlabci-docker-git-gcloud, already has the Google Cloud CLI installed and deploys the javascript to the cloud.
I added our service account key into GitLab as a secret variable, and it appears here as $SERVICE_ACCOUNT_PRIVATE_KEY. The value of this (the contents of the JSON key file) is written to a temporary file, which we then use to authenticate with gcloud auth activate-service-account. This is the key for the default App Engine service account, which can be created and downloaded from the Google Cloud Console.
As the dist directory is shared as an artifact from the first job, we can simply change into that directory and deploy the function. Note that the --entry-point argument is set to the name of the library from the webpack.config.js above.
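Incidentally, the cp package.json dist step in the first job matters because Cloud Functions runs an npm install against that file when you deploy, so anything excluded from the bundle by webpack-node-externals needs to be listed there as a runtime dependency. A minimal sketch (the dependency shown is just a placeholder for whatever your function really uses at runtime) would be:
{
  "name": "my-function",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "some-runtime-dependency": "^1.0.0"
  }
}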
I spent some time pushing and getting errors before I realised there is a perfectly good way to emulate the cloud function from the command line prior to pushing.
If you install the Google Cloud CLI:
sudo curl https://sdk.cloud.google.com | bash
Then install the beta component into it (which contains the cloud functions deployment code):
gcloud components install beta
Then install the functions emulator (btw I found yarn didn’t work here, so use npm):
npm install -g @google-cloud/functions-emulator
You finally end up with a functions emulator which you can deploy to and use to test your code:
functions-emulator start
functions-emulator deploy my-function --entry-point=my-function-endpoint --trigger-http
functions-emulator call my-function
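The call command also takes a --data flag with a JSON string, which is handy for exercising a particular intent locally; the action name here is just the illustrative one from the router sketch above:
functions-emulator call my-function --data '{"result":{"action":"say.hello"}}'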
I thought I’d write this post because I couldn’t find anything quite like it when I was trying to set this all up. Let me know in the comments below if it was useful for you!