Build Powerful and Secure Multiple Outerbase Commands With PNPM Workspace and Vite
A complete guide for getting started with Outerbase Command
Table of contents
- Preparation
- Creating an Outerbase Command to Manipulate Data
- Using PNPM Workspace and Vite
- Integration with 3rd Party Services and Add Security
- Encrypt Secret in Database with Supabase Vault (+ Use Case to Integrate with AWS)
- Protect Endpoint with Token-Based Authentication and JWT (+ Use Case with Clerk)
- Scheduling Job to Remind Sales Person with Upstash
- Add Push Notification If a Deal is Closed with Ntfy
- Automatic Monthly Report Send to Email with Mailgun
- App and Audit Log for Debugging with Axiom
- Conclusion
Note: Outerbase was still in beta at the time this article was written. My approach and solutions below may need adjustment when future features and support, such as reading HTTP request headers and exiting a node chain early, become available.
Note 2: This article is still a work in progress, expected to be finished on September 26th. You may see something like <Placeholder for xxx> or TBA (To Be Added). Follow me or check back periodically for updates.
Outerbase is like DBeaver or phpMyAdmin: you connect your database, then browse tables and query, insert, update and delete your data easily. But unlike those two, Outerbase doesn't require you to install anything, lets you customize the look and feel (not only spreadsheet-like tables) and lets you manipulate the data via JavaScript, SQL and Python through a RESTful API. This ability to create an API without worrying about backend deployment is called a command in Outerbase. In this article, we will try out commands, deep dive into some usage and current limitations, and see what I have prepared for you.
TLDR: I show how to build a command with split files and 3rd party libraries (+ a GitHub repo template -- check below), avoid hard-coding secrets when calling 3rd party services, and protect an endpoint so not everyone can access the data. All of this while building some sample integrations. See the demo video below.
<placeholder for demo video>
GitHub repo for the template: outerbase-commands-workspace
GitHub repo for the complete command with integrations: <placeholder for github url>
In this article, we will mostly talk about the backend side: the Outerbase command, database configuration and integration with several 3rd parties that help us get the job done. List of integrations:
- PNPM and Vite for building complex nodes
- Supabase Vault for storing secrets (+ AWS service example)
- Token-based auth and JWT for protecting endpoints (+ Clerk integration)
- QStash from Upstash for delayed messages
- Ntfy for push notifications
- Mailgun for monthly report emails
- Axiom for app logs and audit logs (helps me as a developer)
Preparation
Idea and Prepare Some Data
Some weeks ago, I had a chat with my father. He wants to track his salespeople's visits to several selected stores in the same region. For that, he needs salesperson data, store data and a list of stores each salesperson needs to visit every day. When a salesperson arrives at a store and does their work, they need to check in by filling out a form and capturing a photo. My father, as the manager, will see a dashboard that shows an overview of all salespeople's progress in real time.
For that purpose, I will use a PostgreSQL database provided by Supabase. Connecting Outerbase and Supabase is really easy, but we will go further to make the most of both. For this demo, I will use some dummy data that represents the actual data. You can see the data in this Google Sheet: https://docs.google.com/spreadsheets/d/1gnyEaF6oJ8XNrOWkWzd9ZdwXkA6ivgJMfAF-QTSbijM/edit?usp=sharing
-- Image: Example Data for Our Internal Tool
Prepare Database and Import Data
You can get a PostgreSQL database for free from Supabase. To connect Supabase with Outerbase, you can follow the official Outerbase docs, or follow the more visual step-by-step guide provided by my fellow tech writer.
-- Image: Outerbase Screen After You Connect to a Database
After you get a database instance, it's time to create some tables and load some data. Download the Google Sheet above as two CSV files, one for salesperson and one for store. Supabase can import data from CSV: open your Project -> Database -> + New Table -> (in the Columns section) import data via spreadsheet. Just drop the file there, give the table a name, set the data types, and the columns and data will be created automatically.
-- Image: Create a Table By Importing a CSV in Supabase
After the import process is finished, we can go back to Outerbase and create some queries. Go to your database and click New -> Query.
-- Image: Outerbase Shows Our Newly Created Table. Now We Will Create a Query Manually or Via EZQL
If you are familiar with Metabase or Redash, you normally write your own SQL to get the data. But Outerbase gives us a fresh way to get it, called EZQL: you just write the question you want to ask about the data, and it generates the SQL and gives you the result. Just like ChatGPT, but it knows the context of your data. Let's give it a try by asking how many salespeople and stores are in my database and the ratio of stores to salespeople.
-- Image: It took me 3 prompts (1 initial, 1 revise and 1 to optimize) until I got the expected result.
-- prompt: I want to know how many I have salesperson, store and ratio of store to salesperson
SELECT COUNT(*) as salesperson_count FROM public.salesperson
-- prompt: well, that only count the salesperson, I also want to see store and the ratio
SELECT
(SELECT COUNT(*) FROM public.salesperson) AS salesperson_count,
(SELECT COUNT(*) FROM public.store) AS store_count,
(
SELECT COUNT(*)::float / (SELECT COUNT(*) FROM public.salesperson) FROM public.store
) AS store_salesperson_ratio
-- prompt: That cool! But I see we query each table twice. Can't we do that only once?
SELECT salesperson_count, store_count, store_count::float / salesperson_count AS store_salesperson_ratio
FROM
(
SELECT COUNT(*) AS salesperson_count
FROM public.salesperson
) sp,
(
SELECT COUNT(*) AS store_count
FROM public.store
) st
Great! So now I know there are 30 salespeople and each of them needs to handle 16-17 stores; not ideal, BTW. Side note: I also asked ChatGPT to revise my first SQL, but no luck since it doesn't know the table schema.
Okay, let's go with our next step. We need to create an agenda table that stores a list of salesperson visits. Let's give EZQL a try.
-- prompt: create a new table called "agenda" that have relation to salesperson and store,
-- and also add a columns: visit_date, status. And don't forget uuid as primary key id.
CREATE TABLE IF NOT EXISTS agenda (
id uuid PRIMARY KEY,
salesperson_id uuid REFERENCES public.salesperson(Id),
store_id uuid REFERENCES public.store(Id),
visit_date date,
status varchar(255)
);
Looks good, but unfortunately, it returns an error. At the time this article was written, Outerbase hung and didn't surface the error. But I believe they will fix it before graduating from beta. Fortunately, I could get the error log from Supabase.
It took me some time to resolve the error, trying EZQL, ChatGPT and StackOverflow, and in the end I got my hands dirty with direct trial and error on the SQL query. The solution is to quote Id in the reference definition: instead of public.salesperson(Id) I need to write public.salesperson("Id"). As far as I can tell, that's because PostgreSQL folds unquoted identifiers to lowercase, and my column name is capitalized (since the CSV is like that and I was too lazy to change it), so the unquoted Id doesn't match the actual "Id" column. Please leave a comment if I got that wrong.
Removing Unused Tables from Outerbase
Supabase already creates some tables for its internal use, and we don't need them for our app. And because we used the root account (postgres) for Outerbase, it fetches all schemas and tables. For my non-technical father, that would be too much!
You may need to read about Access Roles to understand how Outerbase will behave with your data. TLDR, we will create a new Postgres role (account) that has access only to our tables.
create role "outerbase" login password 'your-secret-password';
grant insert, select, update on all tables in schema public to outerbase;
After that, we can change the connection settings in Outerbase (in your project, go to Settings -> Database) and check that we now only see the tables for our app.
-- Image: Before, you see every schema; after logging in with the new role that is granted only specific permissions, the table list is much neater.
Note: You may want to disable RLS (Row Level Security) or create a new policy so Outerbase can access the data. If not, you will not be able to get the data.
Creating an Outerbase Command to Manipulate Data
Okay, we have the data and access to it via queries. Now for the most interesting part: creating logic and manipulating our data with an Outerbase Command.
Hello World for Command
First of all, go create a new command.
A command is a single endpoint path (e.g. /hello) that can be called using the GET, POST, PUT and DELETE verbs. After we hit the endpoint, a node processes the request. If we have more than one node, the next node is executed -- let's call them node-1, node-2, node-3, and so on -- sequentially until the last node returns a response to the user. node-2 can get the return value of node-1, and node-3 can get the return values of node-1 and node-2. All nodes can get data from request.body and request.query.
To help you understand the process, I created a simple hello world to test the result.
// Node name: Node 1
function userCode() {
  return {
    from: "node-1",
    body: {{request.body}},
    query: {{request.query.token}}
  }
}

// Node name: My Custom Name
function userCode() {
  return {
    from: "node-2",
    body: {{request.body}},
    query: {{request.query.token}},
    "node-1": {
      full: {{node-1}},
      from: {{node-1.from}},
      body: {{node-1.body}},
      query: {{node-1.query}}
    }
  }
}

// Node name: Node 3
function userCode() {
  return {
    from: "node-3",
    body: {{request.body}},
    query: {{request.query.token}},
    "node-1": {
      full: {{node-1}},
      from: {{node-1.from}},
      body: {{node-1.body}},
      query: {{node-1.query}}
    },
    "node-2": {
      full: {{my-custom-name}},
      from: {{my-custom-name.from}},
      body: {{my-custom-name.body}},
      query: {{my-custom-name.query}}
    }
  }
}
Keep in mind that's not valid JavaScript, so your linter will complain. Don't worry: Outerbase replaces the variables before executing it as valid JavaScript. See my workaround using PNPM workspace and Vite in the section below for a nicer solution.
And here is the result
{
  "from": "node-3",
  "body": "",
  "query": "test",
  "node-1": {
    "full": "{\"from\":\"node-1\",\"body\":\"\",\"query\":\"test\"}",
    "from": "node-1",
    "body": "",
    "query": "test"
  },
  "node-2": {
    "full": "{\"from\":\"node-2\",\"body\":\"\",\"query\":\"test\",\"node-1\":{\"full\":\"{\\\"from\\\":\\\"node-1\\\",\\\"body\\\":\\\"\\\",\\\"query\\\":\\\"test\\\"}\",\"from\":\"node-1\",\"body\":\"\",\"query\":\"test\"}}",
    "from": "node-2",
    "body": "",
    "query": "test"
  }
}
As you can see, if you refer to {{node-1}} directly, the return value will be a JSON string. So you need to parse it first using JSON.parse({{node-1}}), and make sure there is no quote before or after {{node-1}}, since that would break Outerbase's template mechanism. Also note that if I rename a node, I need to rename how I reference it: I call the second node {{my-custom-name}}, not {{node-2}}, since the name of the second node is My Custom Name.
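For example, a minimal node that only needs one field from the previous node could look like this (the greeting field is just my own illustration):

// minimal illustration: parse the previous node's string result before using it
function userCode() {
  const node1 = JSON.parse({{node-1}}) // note: no quotes around the template
  return { greeting: `hello again from ${node1.from}` }
}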
Accessing Data with SQL
We can use any SQL command and include input from the request data or previous nodes. The use case is not only reading data from the db; we can also insert, update or delete data.
Hitting the endpoint /hello/world?token=test&limit=2 against a SQL node (something like SELECT * FROM public.salesperson LIMIT {{request.query.limit}}) brings back a result like this
{
  "success": true,
  "response": {
    "items": [
      {
        "Id": "7036b866-e2f8-4b34-a7d8-88e0b6e523db",
        "Name": "Erin Vargas",
        "Email": "oscott@example.org",
        "Phone": "+6281267188370",
        "Avatar": "https://i.pravatar.cc/100?u=oscott@example.org"
      },
      {
        "Id": "42a4331c-8e85-43a6-afbf-1791b963e909",
        "Name": "Dr. Susan Combs DDS",
        "Email": "victoria95@example.com",
        "Phone": "+6281280966669",
        "Avatar": "https://i.pravatar.cc/100?u=victoria95@example.com"
      }
    ],
    "schema": false
  }
}
We can use this result in the next node; in this case, I use a JavaScript node. Keep in mind that if you pass {{node-name}} directly, you need to parse it first. For that, I usually create a helper function like this:
function parseResult(resultStr) {
  try {
    const result = JSON.parse(resultStr)
    if (!result.success) return null
    return result.response.items
  } catch {
    // For now, if the SQL throws an error, the return is not valid JSON
    // so we catch it here
    // bug fix in progress by the Outerbase team
    return null
  }
}

function parseResultOne(resultStr) {
  const result = parseResult(resultStr)
  if (result === null) return null
  return result[0]
}

function userCode() {
  const result = parseResultOne({{node-1}})
  return result
}
--
A little bit about SQL injection. I have very little knowledge about it, so please take this with a grain of salt. Assume we have this SQL: SELECT {{request.query.limit}} as limit, {{request.query.token}} as token; Hitting the endpoint /hello/world?token=test&limit=(select "Id" from salesperson limit 1) will not return data from salesperson; instead, the value comes back as a literal string.
{
  "success": true,
  "response": {
    "items": [
      {
        "limit": "(select \"Id\" from salesperson limit 1)",
        "token": "test"
      }
    ],
    "schema": false
  }
}
Limitations of This Approach
For a normal backend, we most likely use more than one endpoint path (command), and a single command will have more than one node. Managing all of that by writing directly in Outerbase is risky, e.g. accidentally deleting a node or command. On top of that:
- We can't use 3rd party libraries or split our code into several files
- We can't reuse the same code
- The template format {{request.body}} breaks the linter/syntax highlighter/IntelliSense in VS Code or your favorite IDE
These problems made me create the solution in the next section.
Using PNPM Workspace and Vite
First of all, I have already created a template you can work from so you don't have to prepare everything yourself. Go to the GitHub repo outerbase-commands-workspace and click use this template (don't forget to star it). The next two subsections explain how the template works internally; after that, we will learn how to use it in real code.
PNPM Workspace for Multiple Commands and Nodes
I use pnpm instead of npm or yarn, and pnpm workspaces are how we build a monorepo (don't confuse it with monolithic). With a monorepo, we can put all of our code for commands and nodes in a single place. Interestingly, we can also share functions between nodes without maintaining another repo (for that lib). Using pnpm, we also save some storage since libraries are symlinked instead of installed into node_modules for each of our nodes.
After using the template and cloning your repo, you can run pnpm install to initialize. Here is the folder structure:
$ tree -L 1
.
├── commands              # the source code
├── dist                  # the build code
├── node_modules
├── package.json
├── pnpm-lock.yaml
├── pnpm-workspace.yaml   # define where is our source code
├── README.md
└── scripts               # custom bash script and templates
And here are the available commands
$ # create a new code
$ pnpm run new COMMAND_NAME NODE_NAME
$ # install dependency from package.json
$ pnpm --filter "COMMAND_NAME-NODE_NAME" install
$ # build into a single file
$ # output: COMMAND_NAME-NODE_NAME.js
$ pnpm --filter "COMMAND_NAME-*" run build
$ # build into a single file and minify (but preserve "userCode" function name)
$ # output: COMMAND_NAME-NODE_NAME.min.js
$ pnpm --filter "COMMAND_NAME-*" run build-minify
$ # build into a single file and minify and remove quote for {{request.*}} and {{node-*}}
$ # output: COMMAND_NAME-NODE_NAME.min.js.outerbase
$ pnpm --filter "COMMAND_NAME-*" run build-minify-outerbase
$ # minify an existing build that is not minified yet
$ pnpm run minify COMMAND_NAME-NODE_NAME # without extension
$ # convert to outerbase compatible template (remove quote for {{request.*}} and {{node-*}}) for existing build
$ pnpm run convert-to-outerbase COMMAND_NAME-NODE_NAME.js # with extension
Running pnpm run xxx looks at the root package.json and runs the xxx script there. But running pnpm --filter "COMMAND_NAME-*" run xxx looks at the package.json of every matched project name and runs the xxx script defined there.
Using pnpm run new actually runs a bash script located at scripts/new_command.sh, which copies the scripts/templates folder into the commands/ folder and adjusts the command and node names in the template. You can modify the content of the template and it will be included in your next generated code.
Vite for Splitting Files and Using 3rd Party Libs
I use Vite for bundling the code (actually, Rollup does the work). Let's try to create a hello world for that: pnpm run new hello node-1. After that, we will have a new folder commands/hello-node-1. Let's see what we have there.
$ tree commands/hello-node-1/
commands/hello-node-1/
├── package.json
├── src
│   ├── index.js
│   └── world.js
└── vite.config.ts

1 directory, 4 files

$ tail commands/hello-node-1/src/*
==> commands/hello-node-1/src/index.js <==
import world from "./world"
export function userCode() {
  console.log = "test"
  return {
    status: "success",
    message: `hello ${world()}`,
    msg: "{{request.query.INPUT_NAME}}"
  };
}

==> commands/hello-node-1/src/world.js <==
export default function user() { return "Tegar"}
In the sample code, I prepared index.js as our entry point, which imports the local file world.js. You may notice that I added quotes around {{request.query.INPUT_NAME}}; that would be broken if we used it directly in an Outerbase node. But if we don't add the quotes, it breaks in our IDE since it's not valid JavaScript syntax. So my solution is to add the quotes in our development environment and then run scripts/outerbase_compatible.sh to remove the double quotes in the build file. Don't worry, I already simplified this into the pnpm run scripts mentioned previously.
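To make that conversion step less magical, here is a rough Node.js illustration of what it does. This is my own sketch, not the actual bash script in the repo; it simply strips the quotes wrapping Outerbase template variables:

// my own rough illustration of the conversion step, not the real bash script
import { readFileSync, writeFileSync } from "node:fs"

const file = process.argv[2] // e.g. dist/hello-node-1.min.js
const code = readFileSync(file, "utf8")
// "{{request.query.number}}" -> {{request.query.number}}
const converted = code.replace(/["']\{\{([^}]+)\}\}["']/g, "{{$1}}")
writeFileSync(`${file}.outerbase`, converted)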
Let's add a 3rd party dependency, build it and deploy it to Outerbase. I will run pnpm --filter "hello*" install is-odd and make some changes:
  import world from "./world"
+ import isOdd from "is-odd"
  export function userCode() {
    console.log = "test"
    return {
      status: "success",
      message: `hello ${world()}`,
-     msg: "{{request.query.INPUT_NAME}}"
+     input: isOdd(parseInt("{{request.query.number}}"))
    };
  }
After that, run pnpm --filter "hello*" run build-minify-outerbase and copy the content of dist/hello-node-1.min.js.outerbase. Yes, it's only a few lines, and all functions are renamed except userCode. The quotes around the Outerbase template variables are also removed. Paste the code into the Outerbase command and deploy it by clicking Run All. You can run the code by making a GET request to /hello/world?number=3. I usually wait until the log shows me something like this to make sure it deployed successfully.
11:28:52:325 Starting to build Spin compatible module
11:28:52:325 Preinitiating using Wizer
11:28:54:070 Optimizing wasm binary using wasm-opt
11:29:10:357 Spin compatible module built successfully
Create More Useful Commands and Nodes using This Repo
Now let's try to create a node that will create an agenda for salespeople to visit several stores on specific dates. We will create a POST endpoint, validate the input and store it in the database.
TBA
Integration with 3rd Party Services and Add Security
Encrypt Secret in Database with Supabase Vault (+ Use Case to Integrate with AWS)
More about Supabase Vault can be read here. In short, we encrypt our secrets in the database so that, in case someone gets access to our database backup, they only get encrypted data. To read the data back, we query a view that returns the decrypted values. We could actually achieve this using the built-in encrypt and decrypt functions like in this example, but I found using Supabase Vault easier (for creating and deleting secrets) and simpler (in SQL form).
First of all, go to your Supabase project and open Settings -> Vault -> Add new secret. Here you will get a form to create a secret, for example to integrate with AWS services:
After that, we can get the secret by using this SQL
-- Get from one key
select name as key, decrypted_secret as value
from vault.decrypted_secrets where name = 'AWS_ACCESS_KEY_ID';
-- Get from multiple keys
select name as key, decrypted_secret as value
from vault.decrypted_secrets where name in
('AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY');
The output will be something like this
{
  "success": true,
  "response": {
    "items": [
      {
        "key": "AWS_ACCESS_KEY_ID",
        "value": "AKIAX7Y4GCQPDK397AVM"
      }
    ],
    "schema": false
  }
}
Don't use this as the last node in the command except for debugging, or else the secret will be leaked to the user
Usually, I make a wrapper so I can access the secrets as key-value pairs. parseResult is from the earlier section on accessing data with SQL.
// Node name: Get Secret
function userCode() {
  const secretStr = {{get-secret-node}}
  const secret = parseResult(secretStr); // Refer to previous section
  const secretKV = secret.reduce((obj, item) => {
    obj[item.key] = item.value;
    return obj;
  }, {});
  return secretKV
}

// In other node
const config = new AWS.Config({
  accessKeyId: {{get-secret.AWS_ACCESS_KEY_ID}},
  secretAccessKey: {{get-secret.AWS_SECRET_ACCESS_KEY}},
  region: 'us-east-1'
});
The example of the AWS config is from the official documentation. You can use your imagination to use AWS Services.
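For instance, a node that lists S3 buckets could look roughly like the sketch below. This is only my own sketch: it assumes you install aws-sdk (v2) through the workspace build described earlier, that the secret node is named Get Secret, and that Outerbase will await an async userCode, which I haven't verified.

// Node name: "List Buckets" -- a sketch only, not tested inside Outerbase
import AWS from "aws-sdk"

async function userCode() {
  const s3 = new AWS.S3({
    accessKeyId: {{get-secret.AWS_ACCESS_KEY_ID}},
    secretAccessKey: {{get-secret.AWS_SECRET_ACCESS_KEY}},
    region: "us-east-1"
  })
  // listBuckets().promise() is the standard aws-sdk v2 promise interface
  const { Buckets } = await s3.listBuckets().promise()
  return { status: "success", buckets: Buckets.map((b) => b.Name) }
}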
While this is possible, I still prefer to put secrets in environment variables, the way we usually do, and comply with the 12-factor app. I hope Outerbase will support this soon.
Protect Endpoint with Token-Based Authentication and JWT (+ Use Case with Clerk)
Since the Outerbase command still does not support built-in authentication, headers or caching, we don't have many options to make sure our endpoint only responds to authorized requests. For this, I will use a token key in the query string. So for each command, I will expect something like this in the first node:
// Node name: "Check Auth"
function userCode() {
// let's hard code the token for now
// and we will modify later
if ({{request.query.token}} === "letmein") {
return { authorized: true }
} else {
return { authorized: false }
}
}
Unfortunately, at the time this article was written, Outerbase didn't support exiting early without continuing down the node chain. So I created a small lib to check authorization and wrap the main function with it. I do this for all nodes after the Check Auth node. Assuming you are using the template repo I provided:
// lib/checkAuth.js
export default function checkAuth(fn) {
  // The build step will remove the quotes
  // Make sure your auth node name is "Check Auth" or else it doesn't work
  if ("{{check-auth.authorized}}") {
    return fn()
  } else {
    return { status: "error", code: 403, message: "Unauthorized" }
  }
}

// Import into main.js
import checkAuth from "@/lib/checkAuth"
function main() {
  // Do something heavy
}
export function userCode() {
  return checkAuth(main)
}
So later, if Outerbase finally supports exiting early, I can simply remove the import and the wrapper function.
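For example, protecting the is-odd node from the earlier section would look like this (my own combination of the two snippets above):

import checkAuth from "@/lib/checkAuth"
import isOdd from "is-odd"

// the heavy work, only executed when the Check Auth node returned authorized: true
function main() {
  return {
    status: "success",
    input: isOdd(parseInt("{{request.query.number}}"))
  }
}

export function userCode() {
  return checkAuth(main)
}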
For Skipping SQL Nodes
PostgreSQL does have an IF statement and I thought we could use it, but it's part of the procedural language (i.e. functions that run inside Postgres). I think it's wiser to wait for the Outerbase team to support this feature rather than go for unnecessary complexity.
Back to the token: the token check is hard-coded and that's not ideal. We can use our earlier approach of encrypting secrets in the database and read the token from there. It's very simple to implement, but since there is only one token, we can't differentiate who makes a call. It's not suitable for apps used by multiple users.
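As a sketch of that single-token setup (my own illustration; the SQL node name Get Token and the secret name API_TOKEN are assumptions), the Check Auth node could compare the query token against the decrypted value:

// Node name: "Check Auth" -- a sketch; assumes a previous SQL node named "Get Token"
// that selects a secret called API_TOKEN from vault.decrypted_secrets
function userCode() {
  const rows = parseResult({{get-token}}) // helper from the "Accessing Data with SQL" section
  const expected = rows && rows.length ? rows[0].value : null
  if (expected !== null && {{request.query.token}} === expected) {
    return { authorized: true }
  }
  return { authorized: false }
}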
Now, for that scenario, we can use a different approach: JWT. We can create our own JWT using a JWT library in our favorite programming language, or use auth services like Clerk or Auth0; in this article we will focus on Clerk. The idea is: the user signs in to Clerk -> Clerk generates a JWT -> the client side (browser) makes a request with the JWT -> the Outerbase command, in the Check Auth node, verifies whether it's a valid JWT or not.
If you want to read more about the JWT verification method.
Normally, we would retrieve the JWT from the request header or a session. But for our approach, we will expect the client to send the JWT via the query string.
For getting the JWT from Clerk, you can see the code I use on StackBlitz here. (Refresh the StackBlitz if it shows an error.)
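If you just want the gist without opening StackBlitz, the client side boils down to something like this (my own sketch; it assumes ClerkJS is loaded, the user is signed in, and your-command-url is a placeholder for your Outerbase command URL):

// client-side sketch: grab a JWT from Clerk and pass it as the token query param
async function callCommand() {
  const token = await window.Clerk.session.getToken()
  const res = await fetch(`https://your-command-url/hello/world?token=${token}`)
  return res.json()
}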
For verifying the JWT, you can refer to the Clerk documentation on manual verification. Below is a modified version, simplified for our use case. Since this needs a 3rd party library, you will want the build tool.
// Node name: "Check Auth"
import jwt from "jsonwebtoken";
function userCode() {
const token = "{{request.query.token}}"
// Get from API Key -> Advanced -> JWT Public Key -> PEM
const publicKey = `-----BEGIN PUBLIC KEY-----
....
-----END PUBLIC KEY-----`;
try {
const decoded = jwt.verify(token, publicKey);
return { authorized: true, sessToken: decoded }
} catch (error) {
return { authorized: false, error: "Invalid Token" }
}
}
Too complicated? Actually, you can use this API from Clerk to verify the token on the server side. But I'm not a fan of that method since it takes time to call the API (and more time if you also retrieve the Clerk secret token from the database secrets).
Scheduling Job to Remind Sales Person with Upstash
TBA
Add Push Notification If a Deal is Closed with Ntfy
TBA
Automatic Monthly Report Send to Email with Mailgun
TBA
App and Audit Log for Debugging with Axiom
TBA
Conclusion
That's all of my exploration with the Outerbase command. It's not a complete app yet; in the next post we will see how I create a dashboard and plugins so my father can oversee his salespeople.
Thank you Hashnode and Outerbase for hosting the hackathon. It pushed me to accelerate my development process and finish this dormant idea.
I usually talk about rapid development and love reviewing Platform as a Service products that make developers' lives easier. Connect with me on Twitter and LinkedIn to talk more about that.