# Deployment
## Deno Deploy
Raptor was built with Deno in mind, so deploying to Deno Deploy is straightforward. Add the following to your deno.json configuration file:
```json
"deploy": {
  "org": "[your-org]",
  "app": "[app-name]"
}
```

Once you've added the deployment information, you can run the deployment via the command line:

```sh
deno deploy
```

## Cloudflare Workers
You can also deploy your application to a Cloudflare Worker by using the `respond` method of the Kernel. This allows you to bypass the usual `serve` method and return a `Response` directly.
```ts
/* ... */

export default {
  fetch: (request: Request) => {
    return kernel.respond(request);
  },
};
```

Once you've set up your application for Cloudflare Workers, you can run the deployment through Wrangler as usual.
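Wrangler also expects a configuration file in the project root. A minimal sketch, assuming the project name and entrypoint path below (both are placeholders, not Raptor conventions):

```toml
# wrangler.toml — minimal Worker configuration (all values are placeholders)
name = "my-raptor-app"
main = "src/index.ts"
compatibility_date = "2024-01-01"
```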
```sh
dx wrangler deploy
# or
npx wrangler deploy
```

## AWS Lambda
### Native Deployment (Node.js)
Deploy your Raptor application to AWS Lambda with minimal setup. Install the Lambda adapter package, pass your Kernel instance to it, and export the result as your handler.
```sh
npx jsr add @raptor/lambda
# or
yarn dlx jsr add @raptor/lambda
# or
pnpm dlx jsr add @raptor/lambda
```

```ts
import lambda from "@raptor/lambda";

const app = new Kernel();

// ...

export const handler = lambda(app);
```

### Docker Deployment (Deno or Bun)
Prefer Deno or Bun? You can run Raptor on either runtime using a Docker image deployed to Lambda. The example below targets Deno, but the same approach applies to Bun with the appropriate base image.
For full details on deploying container images to Lambda, refer to the AWS documentation.
```dockerfile
FROM denoland/deno:2.7.11

WORKDIR /app

COPY deno.json .
COPY raptor.config.ts .
COPY app ./app
COPY config ./config
COPY bin ./bin

RUN deno cache ./bin/index.ts

EXPOSE 8000

CMD ["deno", "run", "--allow-all", "./bin/index.ts"]
```
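For Bun, the same image layout applies with Bun's base image and CLI. A sketch under the assumption that your project keeps the entrypoint and config layout from the Deno example; the image tag and the dependency files are placeholders, not Raptor requirements:

```dockerfile
# Hypothetical Bun variant of the image above (tag and file layout are assumptions)
FROM oven/bun:1.2

WORKDIR /app

# Bun resolves dependencies from package.json instead of deno.json
COPY package.json .
COPY raptor.config.ts .
COPY app ./app
COPY config ./config
COPY bin ./bin

RUN bun install

EXPOSE 8000

CMD ["bun", "run", "./bin/index.ts"]
```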