Same API in Python, Node, Rust, and Go

I built the same small API nine times across four languages: seven frameworks plus two framework-free baselines (a raw Lambda handler and Go’s standard library). A /health endpoint, a GET /items/:id returning JSON, and a POST /items accepting a JSON body - sized to expose how each option handles the basics.

This isn’t a benchmark post. It covers setup friction, type safety, error handling patterns, and how much each option stays out of your way while you write the code.

The API

Every implementation exposes the same three endpoints:

  • GET /health - returns {"status": "ok"}
  • GET /items/:id - returns an item by ID or 404
  • POST /items - accepts {"name": "...", "price": ...}, returns the created item with an ID

Items are stored in memory. No database, no auth, no middleware beyond what’s needed to parse JSON.

Python - FastAPI

FastAPI is the obvious choice for Python APIs now. Flask still works, Django REST Framework is fine for large apps, but FastAPI won the ergonomics race.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class ItemCreate(BaseModel):
    name: str
    price: float

class Item(BaseModel):
    id: int
    name: str
    price: float

items: dict[int, Item] = {}
next_id = 1

@app.get("/health")
def health():
    return {"status": "ok"}

@app.get("/items/{item_id}")
def get_item(item_id: int):
    if item_id not in items:
        raise HTTPException(status_code=404, detail="Item not found")
    return items[item_id]

@app.post("/items", status_code=201)
def create_item(body: ItemCreate):
    global next_id
    item = Item(id=next_id, name=body.name, price=body.price)
    items[next_id] = item
    next_id += 1
    return item

Run it with uv run uvicorn main:app (or uvicorn main:app if you’re managing your own venv) and you get automatic OpenAPI docs at /docs. The Pydantic models validate request bodies and serialise responses. Type hints drive everything - the framework reads them to generate validation, documentation, and route parameter parsing.

The global next_id is ugly but this is a demo. In production you’d use a database and never think about it.

What’s good: Fastest path from nothing to a working, documented API. Pydantic rejects malformed requests with clear error messages before your code runs. Auto-generated OpenAPI docs at /docs save time when other people need to use your API.

What’s less good: Slower than the compiled options. Type hints are just hints. Deployment needs Docker + a runtime.
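
The Pydantic validation doing the heavy lifting here is easy to see in isolation. A minimal sketch, assuming Pydantic v2 and reusing the ItemCreate model from above - no server required:

```python
# Standalone sketch of the validation FastAPI delegates to Pydantic.
# Assumes Pydantic v2; ItemCreate mirrors the model from the example above.
from pydantic import BaseModel, ValidationError

class ItemCreate(BaseModel):
    name: str
    price: float

try:
    # Both fields have the wrong type on purpose.
    ItemCreate(name=42, price="banana")
except ValidationError as e:
    # e.errors() yields one entry per failing field, each with a "loc"
    # identifying the field - this is what FastAPI turns into a 422 response.
    bad_fields = [err["loc"][0] for err in e.errors()]
    print(bad_fields)
```

This is the check that runs before your handler is ever called.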

Node.js - Express

Express is the framework most people learn first. It dates from 2010 and the age shows: slower than the alternatives, security defaults that need tightening (no helmet, no rate limiting out of the box), and req.body is any with no validation. Deno and Bun have tried to address some of these fundamentals at the runtime level, but Express itself hasn’t changed much. Adding Zod at least fixes the validation gap.

import express from "express";
import { z } from "zod";

const app = express();
app.use(express.json());

const ItemCreate = z.object({
  name: z.string(),
  price: z.number(),
});

interface Item {
  id: number;
  name: string;
  price: number;
}

const items = new Map<number, Item>();
let nextId = 1;

app.get("/health", (_req, res) => {
  res.json({ status: "ok" });
});

app.get("/items/:id", (req, res) => {
  const item = items.get(Number(req.params.id));
  if (!item) {
    res.status(404).json({ error: "Item not found" });
    return;
  }
  res.json(item);
});

app.post("/items", (req, res) => {
  const parsed = ItemCreate.safeParse(req.body);
  if (!parsed.success) {
    res.status(400).json({ error: parsed.error.flatten() });
    return;
  }
  const item: Item = { id: nextId++, ...parsed.data };
  items.set(item.id, item);
  res.status(201).json(item);
});

app.listen(3000);

Zod gives you runtime validation with good error messages. Post {"name": 42, "price": "banana"} and you get back field-level errors explaining what’s wrong. Without it, Express would happily store garbage.

What’s good: Everyone knows it. Huge package selection on npm. Zod integration is straightforward and gives you Pydantic-level validation.

What’s less good: You have to bring your own validation - nothing is built in. Error handling is callback-based (next(err)), which feels dated. Express 5 spent the better part of a decade in development before finally shipping stable in late 2024.

Node.js - Hono

Hono is the modern alternative. It runs everywhere - Node, Deno, Bun, Cloudflare Workers, Lambda@Edge. It’s fast and the API is clean.

import { Hono } from "hono";
import { serve } from "@hono/node-server";
import { z } from "zod";

const app = new Hono();

const ItemCreate = z.object({
  name: z.string(),
  price: z.number(),
});

interface Item {
  id: number;
  name: string;
  price: number;
}

const items = new Map<number, Item>();
let nextId = 1;

app.get("/health", (c) => c.json({ status: "ok" }));

app.get("/items/:id", (c) => {
  const item = items.get(Number(c.req.param("id")));
  if (!item) return c.json({ error: "Item not found" }, 404);
  return c.json(item);
});

app.post("/items", async (c) => {
  const parsed = ItemCreate.safeParse(await c.req.json());
  if (!parsed.success) return c.json({ error: parsed.error.flatten() }, 400);
  const item: Item = { id: nextId++, ...parsed.data };
  items.set(item.id, item);
  return c.json(item, 201);
});

serve(app);

The context object c replaces Express’s separate req and res. Everything flows through it - params, body parsing, response building. It’s a small thing but it makes the code read better. Zod works the same way here as in Express - safeParse the body and handle the error.

What’s good: Runs anywhere. Small bundle. The c.json(data, status) pattern is cleaner than Express’s chainable API. Good TypeScript support with typed routes. Hono also has a validator middleware (via @hono/zod-validator) that wires Zod directly into the route definition.

What’s less good: Fewer third-party packages than Express. Some edge runtime limitations depending on where you deploy.

Node.js - Fastify

Fastify was built to be a faster Express with better defaults. It has built-in JSON Schema validation - no extra library needed.

import Fastify from "fastify";

const app = Fastify();

interface Item {
  id: number;
  name: string;
  price: number;
}

const items = new Map<number, Item>();
let nextId = 1;

app.get("/health", async () => ({ status: "ok" }));

app.get<{ Params: { id: string } }>(
  "/items/:id",
  {
    schema: {
      params: {
        type: "object",
        properties: { id: { type: "string" } },
      },
    },
  },
  async (req, reply) => {
    const item = items.get(Number(req.params.id));
    if (!item) {
      reply.status(404);
      return { error: "Item not found" };
    }
    return item;
  }
);

app.post<{ Body: { name: string; price: number } }>(
  "/items",
  {
    schema: {
      body: {
        type: "object",
        required: ["name", "price"],
        properties: {
          name: { type: "string" },
          price: { type: "number" },
        },
      },
    },
  },
  async (req, reply) => {
    const { name, price } = req.body;
    const item: Item = { id: nextId++, name, price };
    items.set(item.id, item);
    reply.status(201);
    return item;
  }
);

app.listen({ port: 3000 });

More verbose than the others, but validation is built in with no extra dependencies. Post a string where a number should be and Fastify rejects it before your handler runs.

What’s good: Built-in validation via JSON Schema. Faster than Express. Auto-generated Swagger docs with @fastify/swagger. Async by default.

What’s less good: JSON Schema is verbose compared to Zod or Pydantic. The TypeScript generics get unwieldy on complex routes. More ceremony than Hono for simple cases.

Node.js - Lambda

No framework at all. Just a Lambda handler behind API Gateway, routing on the raw path and method. This is what you get when you strip everything back.

import { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";
import { z } from "zod";

const ItemCreate = z.object({
  name: z.string(),
  price: z.number(),
});

interface Item {
  id: number;
  name: string;
  price: number;
}

const items = new Map<number, Item>();
let nextId = 1;

export const handler = async (
  event: APIGatewayProxyEventV2
): Promise<APIGatewayProxyResultV2> => {
  const { method } = event.requestContext.http;
  const path = event.rawPath;

  if (method === "GET" && path === "/health") {
    return json(200, { status: "ok" });
  }

  const itemMatch = path.match(/^\/items\/(\d+)$/);
  if (method === "GET" && itemMatch) {
    const item = items.get(Number(itemMatch[1]));
    if (!item) return json(404, { error: "Item not found" });
    return json(200, item);
  }

  if (method === "POST" && path === "/items") {
    let raw: unknown;
    try {
      raw = JSON.parse(event.body ?? "{}");
    } catch {
      // malformed JSON should be a 400, not an unhandled throw (which
      // API Gateway would surface as a 500)
      return json(400, { error: "Invalid JSON" });
    }
    const parsed = ItemCreate.safeParse(raw);
    if (!parsed.success) return json(400, { error: parsed.error.flatten() });
    const item: Item = { id: nextId++, ...parsed.data };
    items.set(item.id, item);
    return json(201, item);
  }

  return json(404, { error: "Not found" });
};

function json(statusCode: number, body: unknown): APIGatewayProxyResultV2 {
  return {
    statusCode,
    headers: { "content-type": "application/json" },
    body: JSON.stringify(body),
  };
}

No router, no middleware, no app.listen(). You parse the event yourself, and the routing is if/else on strings. This is what every framework abstracts over.
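
That string-matching dispatch is exactly what a router generalises: a table of (method, pattern) pairs plus parameter extraction. A minimal stdlib-only sketch of the idea - the route/dispatch helpers are hypothetical, not from any framework:

```typescript
// A tiny route table: method + path pattern -> handler.
// Params captured by the pattern are passed to the handler.
type Handler = (params: Record<string, string>) => { status: number; body: unknown };

interface Route {
  method: string;
  pattern: RegExp;   // e.g. /items/:id compiled to ^/items/([^/]+)$
  names: string[];   // capture-group names, in order
  handler: Handler;
}

const routes: Route[] = [];

function route(method: string, path: string, handler: Handler): void {
  const names: string[] = [];
  // Turn each :param segment into a capture group, remembering its name.
  const pattern = new RegExp(
    "^" +
      path.replace(/:([^/]+)/g, (_m: string, name: string) => {
        names.push(name);
        return "([^/]+)";
      }) +
      "$"
  );
  routes.push({ method, pattern, names, handler });
}

function dispatch(method: string, path: string): { status: number; body: unknown } {
  for (const r of routes) {
    if (r.method !== method) continue;
    const m = r.pattern.exec(path);
    if (!m) continue;
    const params = Object.fromEntries(r.names.map((n, i) => [n, m[i + 1]]));
    return r.handler(params);
  }
  return { status: 404, body: { error: "Not found" } };
}

route("GET", "/health", () => ({ status: 200, body: { status: "ok" } }));
route("GET", "/items/:id", (p) => ({ status: 200, body: { id: Number(p.id) } }));
```

Express, Hono, and Fastify all do a fancier version of this (trie-based matching, middleware chains), but the core contract is the same.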

The in-memory store won’t survive between Lambda invocations in production (each invoke can be a fresh container), but for this comparison the logic is the same.

What’s good: Zero framework overhead, so cold starts have nothing to initialise. You control every detail of the response, and it works with API Gateway v2 (HTTP API) out of the box.

What’s less good: Manual routing scales poorly past a handful of endpoints. No middleware chain. You’re building your own json() helper on day one.

Rust - Actix Web

Actix Web is one of the most popular Rust web frameworks. More code than the others, but the compiler catches bugs that other languages leave for runtime.

use actix_web::{get, post, web, App, HttpResponse, HttpServer};
use serde::{Deserialize, Serialize};
use std::sync::Mutex;

#[derive(Serialize, Clone)]
struct Item {
    id: u32,
    name: String,
    price: f64,
}

#[derive(Deserialize)]
struct ItemCreate {
    name: String,
    price: f64,
}

struct AppState {
    items: Mutex<Vec<Item>>,
    next_id: Mutex<u32>,
}

#[get("/health")]
async fn health() -> HttpResponse {
    HttpResponse::Ok().json(serde_json::json!({"status": "ok"}))
}

#[get("/items/{id}")]
async fn get_item(
    state: web::Data<AppState>,
    path: web::Path<u32>,
) -> HttpResponse {
    let items = state.items.lock().unwrap();
    let id = path.into_inner();
    match items.iter().find(|i| i.id == id) {
        Some(item) => HttpResponse::Ok().json(item),
        None => HttpResponse::NotFound().json(
            serde_json::json!({"error": "Item not found"})
        ),
    }
}

#[post("/items")]
async fn create_item(
    state: web::Data<AppState>,
    body: web::Json<ItemCreate>,
) -> HttpResponse {
    let mut items = state.items.lock().unwrap();
    let mut next_id = state.next_id.lock().unwrap();
    let item = Item {
        id: *next_id,
        name: body.name.clone(),
        price: body.price,
    };
    *next_id += 1;
    items.push(item.clone());
    HttpResponse::Created().json(item)
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    let state = web::Data::new(AppState {
        items: Mutex::new(Vec::new()),
        next_id: Mutex::new(1),
    });
    HttpServer::new(move || {
        App::new()
            .app_data(state.clone())
            .service(health)
            .service(get_item)
            .service(create_item)
    })
    .bind("0.0.0.0:3000")?
    .run()
    .await
}

Most of the extra code is about shared state. The Mutex wrapping exists because Actix runs handlers across multiple threads - the compiler forces you to handle concurrent access. In Python or Node you’d just mutate a global dict. Node’s single-threaded model saves you, but only until an await splits a read from a write.
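
The async-mutation caveat is worth making concrete. In this contrived TypeScript sketch, an await separates reading nextId from writing it, so two concurrent calls both observe the same value - the single-threaded cousin of the race the Rust compiler forces you to handle:

```typescript
let nextId = 1;

async function createItem(): Promise<number> {
  const id = nextId;                           // read
  await new Promise((r) => setTimeout(r, 0));  // stand-in for an awaited DB call
  nextId = id + 1;                             // write, clobbering interleaved updates
  return id;
}

async function demo(): Promise<number[]> {
  // Both calls read nextId before either writes it back.
  return Promise.all([createItem(), createItem()]);
}
```

Run demo() and both items come back with id 1. Nothing in Node stops this; in Actix the Mutex makes the read-modify-write atomic.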

Serde handles JSON serialisation and deserialisation. Derive Serialize on your output types, Deserialize on your input types, and the framework does the rest. Invalid JSON or missing fields get rejected automatically with proper error responses.

What’s good: If it compiles, the API handles concurrent requests correctly. Serde is as good as Pydantic for validation. Per-request overhead is measured in microseconds rather than milliseconds. The binary deploys anywhere with no runtime.

What’s less good: Rust is harder than everything else on this list. Compile times are slow, the Mutex dance adds noise, and lifetimes will fight you as complexity grows. Fewer crates available than packages on npm or pip.

Rust - Axum

Axum is the other major Rust web framework. Built by the Tokio team, it’s async-first and sits directly on top of Tokio and Tower. Handlers are plain async functions and extractors pull data from requests using types.

use axum::{
    extract::{Path, State},
    http::StatusCode,
    routing::{get, post},
    Json, Router,
};
use serde::{Deserialize, Serialize};
use std::sync::{Arc, Mutex};

#[derive(Serialize, Clone)]
struct Item {
    id: u32,
    name: String,
    price: f64,
}

#[derive(Deserialize)]
struct ItemCreate {
    name: String,
    price: f64,
}

struct AppState {
    items: Mutex<Vec<Item>>,
    next_id: Mutex<u32>,
}

type Shared = State<Arc<AppState>>;

async fn health() -> Json<serde_json::Value> {
    Json(serde_json::json!({"status": "ok"}))
}

async fn get_item(
    State(state): Shared,
    Path(id): Path<u32>,
) -> Result<Json<Item>, StatusCode> {
    let items = state.items.lock().unwrap();
    items
        .iter()
        .find(|i| i.id == id)
        .cloned()
        .map(Json)
        .ok_or(StatusCode::NOT_FOUND)
}

async fn create_item(
    State(state): Shared,
    Json(body): Json<ItemCreate>,
) -> (StatusCode, Json<Item>) {
    let mut items = state.items.lock().unwrap();
    let mut next_id = state.next_id.lock().unwrap();
    let item = Item {
        id: *next_id,
        name: body.name,
        price: body.price,
    };
    *next_id += 1;
    items.push(item.clone());
    (StatusCode::CREATED, Json(item))
}

#[tokio::main]
async fn main() {
    let state = Arc::new(AppState {
        items: Mutex::new(Vec::new()),
        next_id: Mutex::new(1),
    });
    let app = Router::new()
        .route("/health", get(health))
        .route("/items/{id}", get(get_item))
        .route("/items", post(create_item))
        .with_state(state);

    let listener =
        tokio::net::TcpListener::bind("0.0.0.0:3000")
            .await
            .unwrap();
    axum::serve(listener, app).await.unwrap();
}

Compare the get_item handler to Actix. In Axum you return Result<Json<Item>, StatusCode> and the framework handles the response. No HttpResponse::Ok().json() or HttpResponse::NotFound().json() - the types do the work. The .cloned().map(Json).ok_or(StatusCode::NOT_FOUND) chain is idiomatic Rust and reads cleanly once you’re used to it.

Routing is also different. Actix uses attribute macros (#[get("/health")]), Axum uses a builder (Router::new().route("/health", get(health))). Axum’s approach keeps routing in one place.

What’s good: Cleaner handler signatures than Actix. The extractor pattern (Path(id), Json(body), State(state)) keeps handlers focused on logic. Same Serde validation, same performance, same single-binary deploy.

What’s less good: Same Rust trade-offs - compile times, learning curve, Mutex for shared state. Error handling with custom types gets complex fast.

Go - Standard Library

Go doesn’t need a framework for a simple API. The standard library’s net/http package, plus encoding/json, covers everything here. Go 1.22 added method matching and path wildcards to http.ServeMux, so you don’t even need a third-party router anymore.

package main

import (
    "encoding/json"
    "net/http"
    "strconv"
    "sync"
)

type Item struct {
    ID    int     `json:"id"`
    Name  string  `json:"name"`
    Price float64 `json:"price"`
}

type ItemCreate struct {
    Name  string  `json:"name"`
    Price float64 `json:"price"`
}

type M = map[string]string

var (
    items  = make(map[int]Item)
    nextID = 1
    mu     sync.Mutex
)

func j(w http.ResponseWriter, status int, v any) {
    w.Header().Set("Content-Type", "application/json")
    w.WriteHeader(status)
    json.NewEncoder(w).Encode(v)
}

func main() {
    mux := http.NewServeMux()

    mux.HandleFunc("GET /health",
        func(w http.ResponseWriter, r *http.Request) {
            j(w, 200, M{"status": "ok"})
        })

    mux.HandleFunc("GET /items/{id}",
        func(w http.ResponseWriter, r *http.Request) {
            id, err := strconv.Atoi(r.PathValue("id"))
            if err != nil {
                j(w, 400, M{"error": "invalid id"})
                return
            }
            mu.Lock()
            item, ok := items[id]
            mu.Unlock()
            if !ok {
                j(w, 404, M{"error": "not found"})
                return
            }
            j(w, 200, item)
        })

    mux.HandleFunc("POST /items",
        func(w http.ResponseWriter, r *http.Request) {
            var body ItemCreate
            err := json.NewDecoder(r.Body).Decode(&body)
            if err != nil {
                j(w, 400, M{"error": "invalid json"})
                return
            }
            mu.Lock()
            item := Item{nextID, body.Name, body.Price}
            items[nextID] = item
            nextID++
            mu.Unlock()
            j(w, 201, item)
        })

    http.ListenAndServe(":3000", mux)
}

No dependencies, and go run main.go runs it.

The code is verbose but there’s nothing hidden. Every error is handled explicitly. Every header is set manually. You can read this top to bottom and know exactly what happens on each request.

The sync.Mutex appears again, like in Rust. Go’s goroutine-per-request model means concurrent access is real and you need to handle it. Unlike Rust, the compiler won’t stop you if you forget the lock - that’s a runtime data race waiting to happen.

What’s good: Zero dependencies. Fast compilation. The binary runs anywhere. The standard library covers most API use cases without reaching for a framework. Single binary, small container images.

What’s less good: Verbose error handling (if err != nil everywhere). Struct tags for JSON mapping are a bit magical. No built-in validation beyond JSON parsing - in production you’d add go-playground/validator for struct tag validation, similar to how Express needs Zod.

Go - Fiber

Fiber is Go’s answer to Express. It’s built on fasthttp instead of net/http and the API will feel familiar if you’re coming from Node.

package main

import (
    "sync"

    "github.com/gofiber/fiber/v2"
)

type Item struct {
    ID    int     `json:"id"`
    Name  string  `json:"name"`
    Price float64 `json:"price"`
}

type ItemCreate struct {
    Name  string  `json:"name"`
    Price float64 `json:"price"`
}

var (
    items  = make(map[int]Item)
    nextID = 1
    mu     sync.Mutex
)

func main() {
    app := fiber.New()

    app.Get("/health", func(c *fiber.Ctx) error {
        return c.JSON(fiber.Map{"status": "ok"})
    })

    app.Get("/items/:id", func(c *fiber.Ctx) error {
        id, err := c.ParamsInt("id")
        if err != nil {
            return c.Status(400).JSON(fiber.Map{
                "error": "invalid id",
            })
        }
        mu.Lock()
        item, ok := items[id]
        mu.Unlock()
        if !ok {
            return c.Status(404).JSON(fiber.Map{
                "error": "Item not found",
            })
        }
        return c.JSON(item)
    })

    app.Post("/items", func(c *fiber.Ctx) error {
        var body ItemCreate
        if err := c.BodyParser(&body); err != nil {
            return c.Status(400).JSON(fiber.Map{
                "error": "invalid json",
            })
        }
        mu.Lock()
        item := Item{nextID, body.Name, body.Price}
        items[nextID] = item
        nextID++
        mu.Unlock()
        return c.Status(201).JSON(item)
    })

    app.Listen(":3000")
}

Compare this to the stdlib version above. c.ParamsInt("id") replaces strconv.Atoi(r.PathValue("id")). c.BodyParser(&body) replaces the json.NewDecoder dance. c.Status(201).JSON(item) replaces the manual header/status/encode sequence. The sync.Mutex is still there - Fiber uses goroutines just like net/http, so you still need to handle concurrent access yourself.

What’s good: Express-like API in Go. Less boilerplate than stdlib. fiber.Map saves you from map[string]string everywhere. Fast - built on fasthttp.

What’s less good: Same validation gap as stdlib - you’d add go-playground/validator in production. The fasthttp dependency means some net/http middleware won’t work directly. Same if err != nil pattern.

Comparison

Framework  Lines  Dependencies       Validation              Type safety       Auto docs  Concurrency        Deploy
FastAPI    30     uvicorn, pydantic  pydantic (built-in)     runtime           yes        GIL helps          docker
Express    38     express, zod       zod                     runtime (zod)     no         single-threaded    docker
Hono       30     hono, zod          zod                     runtime (zod)     no         single-threaded    docker
Fastify    40     fastify            JSON Schema (built-in)  runtime (schema)  plugin     single-threaded    docker
Lambda     42     zod                zod                     runtime (zod)     no         per-invocation     zip
Actix Web  65     actix-web, serde   serde (built-in)        compile-time      no         compiler-enforced  binary
Axum       60     axum, serde        serde (built-in)        compile-time      no         compiler-enforced  binary
Go stdlib  62     none               none                    partial           no         manual             binary
Go Fiber   48     fiber              none                    partial           no         manual             binary

All nine implementations work. The differences come down to what you value: validation out of the box, compile-time safety, deployment simplicity, package availability. Pick based on what your team already knows and the trade-offs above, not raw performance alone.

All nine implementations are at github.com/danieljohnmorris/simple-api-examples.