With Cloud Run, the fully managed serverless container platform on Google Cloud, you can quickly and easily deploy applications packaged as standard containers. In this article, we explain how to build a chat server on Cloud Run using Python as the development language, with the FastAPI framework, based on this FastAPI sample source code.

[Note that this article does not provide detailed descriptions of each service. Refer to other articles for details like Cloud Run settings and the cloudbuild.yaml file format.]

Chat server architecture

[Diagram: chat server architecture]

The chat server consists of two Cloud Run services: frontend and backend. Code is managed on GitHub and deployed with Cloud Build, and chat messages are passed between users via Redis pub/sub on Memorystore.

When creating the frontend and backend Cloud Run services, allow unauthenticated invocations (with ingress set to “Allow all traffic”) so both can be reached publicly. The two services communicate over a WebSocket, and the backend connects to Memorystore through a serverless VPC access connector.

Let's take a look at each service one by one.

Frontend

index.html

The frontend service is written in plain HTML. The only part you need to modify is the WebSocket connection URL in the middle, which should point to the backend Cloud Run service. This code is not production-ready; it is just a sample to show the chat in action.

```html
<!DOCTYPE html>
<html>
    <head>
        <title>Chat</title>
    </head>
    <body>
        <h1>Chat</h1>
        <h2>Room: <span id="room-id"></span><br> Your ID: <span id="client-id"></span></h2>
        <label>Room: <input type="text" id="channelId" autocomplete="off" value="foo"/></label>
        <button onclick="connect(event)">Connect</button>
        <hr>
        <form style="position: absolute; bottom:0" action="" onsubmit="sendMessage(event)">
            <input type="text" id="messageText" autocomplete="off"/>
            <button>Send</button>
        </form>
        <ul id='messages'>
        </ul>
        <script>
            var ws = null;
            function connect(event) {
                var client_id = Date.now()
                document.querySelector("#client-id").textContent = client_id;
                document.querySelector("#room-id").textContent = channelId.value;
                if (ws) ws.close()
                ws = new WebSocket(`wss://xxx-du.a.run.app/ws/${channelId.value}/${client_id}`);
                ws.onmessage = function(event) {
                    var messages = document.getElementById('messages')
                    var message = document.createElement('li')
                    var content = document.createTextNode(event.data)
                    message.appendChild(content)
                    messages.appendChild(message)
                };
                event.preventDefault()
            }
            function sendMessage(event) {
                var input = document.getElementById("messageText")
                ws.send(input.value)
                input.value = ''
                event.preventDefault()
                document.getElementById("messageText").focus()
            }
        </script>
    </body>
</html>
```

Dockerfile

The Dockerfile is very simple. Because the service only serves static HTML, nginx:alpine is a good fit.

```dockerfile
FROM nginx:alpine

COPY index.html /usr/share/nginx/html
```

cloudbuild.yaml

The last part of the frontend service is the cloudbuild.yaml file. You only need to replace project_id with your project ID and, if you wish, the service name “frontend”.

```yaml
steps:
  # Build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/project_id/frontend:$COMMIT_SHA', '.']
  # Push the container image to Container Registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/project_id/frontend:$COMMIT_SHA']
  # Deploy container image to Cloud Run
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      - 'run'
      - 'deploy'
      - 'frontend'
      - '--image'
      - 'gcr.io/project_id/frontend:$COMMIT_SHA'
      - '--region'
      - 'asia-northeast3'
      - '--port'
      - '80'
images:
  - 'gcr.io/project_id/frontend:$COMMIT_SHA'
```

Backend Service

main.py

Let's look at the server Python code first, starting with the core ChatServer class.

```python
class RedisService:
    def __init__(self):
        self.redis_host = f"{os.environ.get('REDIS_HOST', 'redis://localhost')}"

    async def get_conn(self):
        return await aioredis.from_url(self.redis_host, encoding="utf-8", decode_responses=True)


class ChatServer(RedisService):
    def __init__(self, websocket, channel_id, client_id):
        super().__init__()
        self.ws: WebSocket = websocket
        self.channel_id = channel_id
        self.client_id = client_id
        self.redis = RedisService()

    async def publish_handler(self, conn: Redis):
        try:
            while True:
                message = await self.ws.receive_text()
                if message:
                    now = datetime.now()
                    date_time = now.strftime("%Y-%m-%d %H:%M:%S")
                    chat_message = ChatMessage(
                        channel_id=self.channel_id, client_id=self.client_id, time=date_time, message=message
                    )
                    await conn.publish(self.channel_id, json.dumps(asdict(chat_message)))
        except Exception as e:
            logger.error(e)

    async def subscribe_handler(self, pubsub: PubSub):
        await pubsub.subscribe(self.channel_id)
        try:
            while True:
                message = await pubsub.get_message(ignore_subscribe_messages=True)
                if message:
                    data = json.loads(message.get("data"))
                    chat_message = ChatMessage(**data)
                    await self.ws.send_text(f"[{chat_message.time}] {chat_message.message} ({chat_message.client_id})")
        except Exception as e:
            logger.error(e)

    async def run(self):
        conn: Redis = await self.redis.get_conn()
        pubsub: PubSub = conn.pubsub()

        tasks = [self.publish_handler(conn), self.subscribe_handler(pubsub)]
        results = await asyncio.gather(*tasks)

        logger.info(f"Done task: {results}")
```

This is typical chat server code. The ChatServer class has two coroutine methods: publish_handler publishes a message to the chat room (Redis) whenever one arrives over the WebSocket, and subscribe_handler delivers messages from the chat room (Redis) to the connected WebSocket. The run method connects to Redis and runs both coroutines concurrently.
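The concurrency pattern in run is easy to see in isolation. Below is a minimal, self-contained sketch of the same gather-two-coroutines structure, with an asyncio.Queue standing in for Redis pub/sub; the names and the queue are illustrative only, not part of the article's code:

```python
import asyncio

# Sketch of the ChatServer.run pattern: two coroutines started together with
# asyncio.gather. An asyncio.Queue stands in for Redis pub/sub here.

async def publish_handler(queue: asyncio.Queue, messages: list) -> None:
    # Stands in for reading from the WebSocket and publishing to Redis.
    for text in messages:
        await queue.put(text)
    await queue.put(None)  # sentinel so the subscriber knows to stop

async def subscribe_handler(queue: asyncio.Queue, received: list) -> None:
    # Stands in for reading from Redis and sending to the WebSocket.
    while True:
        message = await queue.get()
        if message is None:
            break
        received.append(message)

async def run_chat() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    received: list = []
    # Run both handlers concurrently, as ChatServer.run does with gather.
    await asyncio.gather(
        publish_handler(queue, ["hello", "world"]),
        subscribe_handler(queue, received),
    )
    return received

print(asyncio.run(run_chat()))  # ['hello', 'world']
```

In the real service, publish_handler blocks on the WebSocket and subscribe_handler blocks on Redis, so neither can drive the other; running them as sibling tasks under gather is what lets messages flow in both directions on one connection.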

This brings us to the endpoint. When a request comes in, this code accepts the WebSocket connection and hands it to the chat server.

```python
@app.websocket("/ws/{channel_id}/{client_id}")
async def websocket_endpoint(websocket: WebSocket, channel_id: str, client_id: int):
    await manager.connect(websocket)

    chat_server = ChatServer(websocket, channel_id, client_id)
    await chat_server.run()
```

Here is the rest of the code. Combined with the snippets above, it makes up the complete main.py.

```python
import asyncio
import json
import logging
import os
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import List

import aioredis
from aioredis.client import Redis, PubSub
from fastapi import FastAPI, WebSocket

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

app = FastAPI()


class ConnectionManager:
    def __init__(self):
        self.active_connections: List[WebSocket] = []

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        self.active_connections.append(websocket)

    def disconnect(self, websocket: WebSocket):
        self.active_connections.remove(websocket)

    async def send_personal_message(self, message: str, websocket: WebSocket):
        await websocket.send_text(message)

    async def broadcast(self, message: dict):
        for connection in self.active_connections:
            await connection.send_json(message, mode="text")


manager = ConnectionManager()


@dataclass
class ChatMessage:
    channel_id: str
    client_id: int
    time: str
    message: str
```

Dockerfile

The following is the Dockerfile for the backend service. Run this application with Uvicorn.

```dockerfile
FROM python:3.8-slim
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
CMD [ "uvicorn", "main:app", "--host", "0.0.0.0" ]
```

requirements.txt

Put the packages for FastAPI and Redis into requirements.txt.

```
aioredis==2.0.1
fastapi==0.85.0
uvicorn[standard]
```

cloudbuild.yaml

The last step is the cloudbuild.yaml file. Just as with the frontend service, edit project_id and the service name (“backend”), and set REDIS_HOST to the IP of the Memorystore instance created below.

```yaml
steps:
  # Build the container image
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/project_id/backend:$COMMIT_SHA', '.']
  # Push the container image to Container Registry
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/project_id/backend:$COMMIT_SHA']
  # Deploy container image to Cloud Run
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      - 'run'
      - 'deploy'
      - 'backend'
      - '--image'
      - 'gcr.io/project_id/backend:$COMMIT_SHA'
      - '--region'
      - 'asia-northeast3'
      - '--port'
      - '8000'
      - '--update-env-vars'
      - 'REDIS_HOST=redis://10.87.130.75'
images:
  - 'gcr.io/project_id/backend:$COMMIT_SHA'
```

Cloud Build

You can set Cloud Build to automatically build and deploy to Cloud Run when source code is pushed to GitHub. Just select “Create trigger” and enter the required values. First, select “Push to a branch” as the Event.

[Screenshot: “Create trigger” in Cloud Build]

Next, choose the source repository. If this is your first time, you will need to authenticate with GitHub. Since our repository contains cloudbuild.yaml, set “Location” to the repository as well.

[Screenshot: “Edit trigger” in Cloud Build]
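If you prefer the command line, the same kind of trigger can be created with gcloud. This is a sketch; the owner and repository names below are placeholders for a GitHub repository you have already connected to Cloud Build:

```shell
# Hypothetical owner/repo names; replace with your connected GitHub repository.
gcloud builds triggers create github \
  --name="deploy-frontend" \
  --repo-owner="your-github-user" \
  --repo-name="your-chat-repo" \
  --branch-pattern="^main$" \
  --build-config="cloudbuild.yaml"
```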

Serverless VPC access connector

Since both the frontend and backend services live on the public internet, you’ll need a serverless VPC access connector so the backend can reach the Memorystore instance in the private network. You can create one with the following command:

```shell
gcloud compute networks vpc-access connectors create chat-connector \
  --region=us-central1 \
  --network=default \
  --range=10.100.0.0/28 \
  --min-instances=2 \
  --max-instances=10 \
  --machine-type=e2-micro
```

Create memorystore

To create the Memorystore instance that will relay the chat messages, run the following command:

```shell
gcloud redis instances create myinstance --size=2 --region=us-central1 \
    --redis-version=redis_6_X
```

Chat test

To demonstrate what you should see, we put two users into a conversation in a chat room called “test”. This works regardless of how many users connect, and users cannot see the conversations in other chat rooms until they join them.

[Screenshot: chat room “test” with two connected users]

Wrap-up

In this article, we built a serverless chat server using Cloud Run. By using Firestore instead of Memorystore, it is also possible to make the entire architecture serverless. And since the code is container-based, it is easy to move to another environment such as GKE Autopilot, but Cloud Run is already a great platform for deploying microservices: instances scale quickly and elastically with the number of connected users, so why choose another platform? Try it out now in the Cloud Console.