Backend

How to Build a REST API With Node.js, Express, and PostgreSQL

A complete guide to building production-ready REST APIs with Node.js, Express, and PostgreSQL — from project setup to authentication, validation, error handling, and deployment.

Shantanu Kumar
Chief Solutions Architect
March 12, 2026
18 min read
Updated March 2026

Building a REST API is one of the most common tasks in modern software development. Whether you are building a mobile app backend, a SaaS dashboard, or an internal tool, the API layer is the foundation everything else depends on. Get it wrong and every feature built on top becomes harder to ship, harder to debug, and harder to scale.

This guide walks through building a production-grade REST API using Node.js with Express and PostgreSQL. We are not building a toy. We are building the same architecture pattern our team uses for client projects — with proper error handling, input validation, authentication, database migrations, and a structure that scales from one endpoint to hundreds.

A production API is not just endpoints that return JSON. It is a system with predictable error behavior, validated inputs, authenticated access, and observable performance.

1) Project structure and initial setup

Start with a clear folder structure. Mixing routes, business logic, and database queries into the same files is the fastest way to create unmaintainable code. We separate concerns from day one: routes define URL patterns, controllers handle request/response logic, services contain business rules, and models interact with the database.

```
project/
├── src/
│   ├── config/
│   │   └── database.js          # PostgreSQL connection pool
│   ├── middleware/
│   │   ├── auth.js              # JWT authentication
│   │   ├── validate.js          # Request validation
│   │   ├── rateLimit.js         # Rate limiting and request IDs
│   │   └── errorHandler.js      # Global error handler
│   ├── validators/
│   │   └── userSchemas.js       # Joi schemas
│   ├── routes/
│   │   ├── index.js             # Route aggregator
│   │   ├── users.js             # User endpoints
│   │   └── projects.js          # Project endpoints
│   ├── controllers/
│   │   ├── userController.js
│   │   └── projectController.js
│   ├── services/
│   │   ├── userService.js
│   │   └── projectService.js
│   ├── models/
│   │   ├── userModel.js
│   │   └── projectModel.js
│   └── app.js                   # Express application setup
├── migrations/
│   └── 001_initial_schema.sql
├── package.json
└── server.js                    # Entry point
```

Initialize the project and install the core dependencies. We use pg for PostgreSQL, express for HTTP routing, jsonwebtoken for auth tokens, bcrypt for password hashing, and joi for input validation.

```bash
mkdir my-api && cd my-api
npm init -y
npm pkg set type=module   # the code in this guide uses ES module imports
npm install express pg jsonwebtoken bcrypt joi helmet cors dotenv
npm install -D nodemon
```

2) Database connection with connection pooling

Never create a new database connection per request. Connection pooling reuses a fixed number of connections, which eliminates connection overhead and prevents exhausting your database under load. The pg library provides a built-in Pool class that handles this automatically.

```javascript
// src/config/database.js
import pg from 'pg'

const pool = new pg.Pool({
  host: process.env.DB_HOST,
  port: parseInt(process.env.DB_PORT, 10) || 5432,
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  max: 20, // maximum connections in the pool
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 5000,
  ssl: process.env.NODE_ENV === 'production'
    ? { rejectUnauthorized: false }
    : false,
})

pool.on('error', (err) => {
  console.error('Unexpected database pool error:', err)
  process.exit(1)
})

export async function query(text, params) {
  const start = Date.now()
  const result = await pool.query(text, params)
  const duration = Date.now() - start

  if (duration > 200) {
    console.warn('Slow query detected:', {
      text: text.substring(0, 80),
      duration: `${duration}ms`,
      rows: result.rowCount,
    })
  }

  return result
}

export async function getClient() {
  const client = await pool.connect()
  const originalRelease = client.release.bind(client)

  // Restore the original release on first call so an accidental
  // double release fails loudly instead of silently corrupting the pool.
  client.release = () => {
    client.release = originalRelease
    return originalRelease()
  }

  return client
}

export default pool
```

Notice the slow query warning. In production, you want visibility into database performance from day one. A 200ms threshold catches most problem queries before they cause user-facing latency. This single pattern has saved us hours of debugging on client projects.

3) Database schema and migrations

Define your schema in SQL migration files, not in application code. Migration files are version-controlled, reviewable, and reproducible across environments, so every team member and every deployment gets the exact same database structure. Apply them in numbered order with psql -f, or use a dedicated migration tool such as node-pg-migrate.

```sql
-- migrations/001_initial_schema.sql
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

CREATE TABLE users (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  email VARCHAR(255) UNIQUE NOT NULL,
  password VARCHAR(255) NOT NULL,
  full_name VARCHAR(100) NOT NULL,
  role VARCHAR(20) DEFAULT 'user' CHECK (role IN ('user', 'admin')),
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE TABLE projects (
  id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
  user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
  name VARCHAR(200) NOT NULL,
  description TEXT,
  status VARCHAR(20) DEFAULT 'active'
    CHECK (status IN ('active', 'completed', 'archived')),
  budget NUMERIC(12,2),
  deadline DATE,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE INDEX idx_projects_user_id ON projects(user_id);
CREATE INDEX idx_projects_status ON projects(status);
-- users.email needs no separate index: its UNIQUE constraint already creates one

-- Auto-update the updated_at timestamp
CREATE OR REPLACE FUNCTION update_timestamp()
RETURNS TRIGGER AS $$
BEGIN
  NEW.updated_at = NOW();
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER set_users_updated
  BEFORE UPDATE ON users
  FOR EACH ROW EXECUTE FUNCTION update_timestamp();

CREATE TRIGGER set_projects_updated
  BEFORE UPDATE ON projects
  FOR EACH ROW EXECUTE FUNCTION update_timestamp();
```

UUIDs as primary keys prevent enumeration attacks and make multi-database merges easier. The CHECK constraints on status and role enforce data integrity at the database level, which is stronger than application-level validation alone. The trigger function automatically maintains updated_at timestamps without requiring application code to remember to set them.
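You can see the database-level enforcement for yourself from psql. The session below is illustrative; the constraint name shown is the one PostgreSQL generates by default for a column-level CHECK:

```sql
-- An invalid role never reaches the table:
INSERT INTO users (email, password, full_name, role)
VALUES ('test@example.com', 'not-a-real-hash', 'Test User', 'superadmin');
-- ERROR:  new row for relation "users" violates check constraint "users_role_check"
```

Even a buggy code path that bypasses Joi validation cannot write an invalid role or status.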

4) Express application setup with security middleware

Configure Express with production security defaults from the start. Helmet sets secure HTTP headers, CORS controls cross-origin access, and a JSON body parser with size limits prevents payload abuse. Adding these after the fact is how security vulnerabilities ship to production.

```javascript
// src/app.js
import express from 'express'
import helmet from 'helmet'
import cors from 'cors'
import routes from './routes/index.js'
import { errorHandler } from './middleware/errorHandler.js'

const app = express()

// Security headers
app.use(helmet())

// CORS configuration. The '*' fallback is for local development only:
// browsers reject a wildcard origin when credentials is true, so always
// set ALLOWED_ORIGINS in production.
app.use(cors({
  origin: process.env.ALLOWED_ORIGINS?.split(',') || '*',
  methods: ['GET', 'POST', 'PUT', 'PATCH', 'DELETE'],
  allowedHeaders: ['Content-Type', 'Authorization'],
  credentials: true,
}))

// Body parsing with size limits
app.use(express.json({ limit: '10kb' }))
app.use(express.urlencoded({ extended: false }))

// Request logging
app.use((req, res, next) => {
  const start = Date.now()
  res.on('finish', () => {
    const duration = Date.now() - start
    if (duration > 1000) {
      console.warn(`Slow request: ${req.method} ${req.path} ${duration}ms`)
    }
  })
  next()
})

// Health check
app.get('/health', (req, res) => {
  res.json({ status: 'ok', uptime: process.uptime() })
})

// API routes
app.use('/api', routes)

// 404 handler
app.use((req, res) => {
  res.status(404).json({ error: 'Endpoint not found' })
})

// Global error handler (must be last)
app.use(errorHandler)

export default app
```

5) Input validation with Joi

Never trust client input. Every request body, query parameter, and URL parameter should be validated before it reaches your business logic. Joi provides schema-based validation that is expressive, composable, and produces clear error messages. We create a reusable middleware that wraps any Joi schema.

```javascript
// src/middleware/validate.js
export function validate(schema, source = 'body') {
  return (req, res, next) => {
    const { error, value } = schema.validate(req[source], {
      abortEarly: false,
      stripUnknown: true,
    })

    if (error) {
      const messages = error.details.map((d) => d.message)
      return res.status(400).json({
        error: 'Validation failed',
        details: messages,
      })
    }

    req[source] = value
    next()
  }
}
```
```javascript
// src/validators/userSchemas.js
import Joi from 'joi'

export const registerSchema = Joi.object({
  email: Joi.string().email().required().lowercase().trim(),
  password: Joi.string().min(8).max(128).required(),
  full_name: Joi.string().min(1).max(100).required().trim(),
})

export const loginSchema = Joi.object({
  email: Joi.string().email().required().lowercase().trim(),
  password: Joi.string().required(),
})

export const updateUserSchema = Joi.object({
  full_name: Joi.string().min(1).max(100).trim(),
  email: Joi.string().email().lowercase().trim(),
}).min(1)

export const projectSchema = Joi.object({
  name: Joi.string().min(1).max(200).required().trim(),
  description: Joi.string().max(5000).allow('').trim(),
  status: Joi.string().valid('active', 'completed', 'archived'),
  budget: Joi.number().positive().precision(2).max(999999999999),
  deadline: Joi.date().iso().greater('now'),
})
```

The stripUnknown option silently removes fields that are not in the schema, preventing unexpected data from reaching your database. Setting abortEarly to false collects all validation errors at once instead of returning on the first failure, which gives API consumers a much better developer experience.
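For example, a registration request with a malformed email and a six-character password gets one 400 response listing both problems. The shape below is what the validate middleware produces; the message strings are representative of Joi's defaults:

```json
{
  "error": "Validation failed",
  "details": [
    "\"email\" must be a valid email",
    "\"password\" length must be at least 8 characters long"
  ]
}
```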

6) JWT authentication middleware

Authentication middleware verifies the JWT token on every protected request and attaches the decoded user payload to the request object. This keeps auth logic in one place instead of duplicated across controllers.

```javascript
// src/middleware/auth.js
import jwt from 'jsonwebtoken'

const JWT_SECRET = process.env.JWT_SECRET
if (!JWT_SECRET) {
  // Fail fast at startup rather than issuing unverifiable tokens
  throw new Error('JWT_SECRET environment variable is not set')
}

export function authenticate(req, res, next) {
  const header = req.headers.authorization
  if (!header?.startsWith('Bearer ')) {
    return res.status(401).json({ error: 'Authentication required' })
  }

  const token = header.slice(7)

  try {
    const decoded = jwt.verify(token, JWT_SECRET)
    req.user = decoded
    next()
  } catch (err) {
    if (err.name === 'TokenExpiredError') {
      return res.status(401).json({ error: 'Token expired' })
    }
    return res.status(401).json({ error: 'Invalid token' })
  }
}

export function authorize(...roles) {
  return (req, res, next) => {
    if (!roles.includes(req.user.role)) {
      return res.status(403).json({ error: 'Insufficient permissions' })
    }
    next()
  }
}

export function generateToken(user) {
  return jwt.sign(
    { id: user.id, email: user.email, role: user.role },
    JWT_SECRET,
    { expiresIn: '24h' }
  )
}
```

The authorize function is a higher-order middleware that checks user roles. You can chain it after authenticate to protect admin-only endpoints. This pattern — authenticate then authorize — is how enterprise APIs enforce least-privilege access without cluttering controller logic.
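To make the flow concrete, here is the same authorize factory exercised against plain stand-in objects, no Express required. The makeRes helper is a hypothetical mock built just for this demo:

```javascript
// Same authorize factory as in src/middleware/auth.js.
function authorize(...roles) {
  return (req, res, next) => {
    if (!roles.includes(req.user.role)) {
      return res.status(403).json({ error: 'Insufficient permissions' })
    }
    next()
  }
}

// Minimal stand-in for an Express response object (demo only).
function makeRes() {
  const res = { statusCode: 200, body: null }
  res.status = (code) => { res.statusCode = code; return res }
  res.json = (obj) => { res.body = obj; return res }
  return res
}

const adminOnly = authorize('admin')

// A regular user is rejected with 403 and never reaches the handler...
const denied = makeRes()
adminOnly({ user: { role: 'user' } }, denied, () => {})
console.log(denied.statusCode) // 403

// ...while an admin falls through to next().
let reachedHandler = false
adminOnly({ user: { role: 'admin' } }, makeRes(), () => { reachedHandler = true })
console.log(reachedHandler) // true
```

In a real route the chain reads left to right: authenticate populates req.user, then authorize checks it.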

7) Model layer — clean database queries

Models encapsulate all SQL queries. Controllers never write raw SQL. This separation means you can swap databases, optimize queries, or add caching without touching your route logic. Each model function takes parameters and returns plain JavaScript objects.

```javascript
// src/models/userModel.js
import { query } from '../config/database.js'
import bcrypt from 'bcrypt'

const SALT_ROUNDS = 12

export async function createUser({ email, password, full_name }) {
  const hashedPassword = await bcrypt.hash(password, SALT_ROUNDS)

  const result = await query(
    `INSERT INTO users (email, password, full_name)
     VALUES ($1, $2, $3)
     RETURNING id, email, full_name, role, created_at`,
    [email, hashedPassword, full_name]
  )

  return result.rows[0]
}

export async function findByEmail(email) {
  const result = await query(
    'SELECT * FROM users WHERE email = $1',
    [email]
  )
  return result.rows[0] || null
}

export async function findById(id) {
  const result = await query(
    `SELECT id, email, full_name, role, created_at, updated_at
     FROM users WHERE id = $1`,
    [id]
  )
  return result.rows[0] || null
}

export async function verifyPassword(plainText, hashed) {
  return bcrypt.compare(plainText, hashed)
}
```
```javascript
// src/models/projectModel.js
import { query } from '../config/database.js'

export async function findByUserId(userId, { page = 1, limit = 20, status } = {}) {
  const offset = (page - 1) * limit
  const conditions = ['user_id = $1']
  const params = [userId]

  if (status) {
    conditions.push(`status = $${params.length + 1}`)
    params.push(status)
  }

  const where = conditions.join(' AND ')

  const [dataResult, countResult] = await Promise.all([
    query(
      `SELECT * FROM projects
       WHERE ${where}
       ORDER BY created_at DESC
       LIMIT $${params.length + 1} OFFSET $${params.length + 2}`,
      [...params, limit, offset]
    ),
    query(
      `SELECT COUNT(*) FROM projects WHERE ${where}`,
      params
    ),
  ])

  const total = parseInt(countResult.rows[0].count, 10)

  return {
    data: dataResult.rows,
    total,
    page,
    limit,
    totalPages: Math.ceil(total / limit),
  }
}

export async function create(userId, projectData) {
  const result = await query(
    `INSERT INTO projects (user_id, name, description, status, budget, deadline)
     VALUES ($1, $2, $3, $4, $5, $6)
     RETURNING *`,
    [userId, projectData.name, projectData.description,
     projectData.status || 'active', projectData.budget, projectData.deadline]
  )
  return result.rows[0]
}

export async function update(projectId, userId, updates) {
  // Interpolating column names is safe here only because updates has
  // already passed Joi validation with stripUnknown, so only known
  // whitelisted fields can appear.
  const fields = Object.keys(updates)
  const setClause = fields
    .map((f, i) => `${f} = $${i + 3}`)
    .join(', ')

  const result = await query(
    `UPDATE projects SET ${setClause}
     WHERE id = $1 AND user_id = $2
     RETURNING *`,
    [projectId, userId, ...Object.values(updates)]
  )
  return result.rows[0] || null
}

export async function remove(projectId, userId) {
  const result = await query(
    'DELETE FROM projects WHERE id = $1 AND user_id = $2 RETURNING id',
    [projectId, userId]
  )
  return result.rowCount > 0
}
```

8) Controllers — request and response handling

Controllers are thin. They extract data from the request, call service or model functions, and format the response. No business logic lives here. This makes controllers easy to test and easy to read during code review.

```javascript
// src/controllers/userController.js
import * as userModel from '../models/userModel.js'
import { generateToken } from '../middleware/auth.js'

export async function register(req, res, next) {
  try {
    const existing = await userModel.findByEmail(req.body.email)
    if (existing) {
      return res.status(409).json({ error: 'Email already registered' })
    }

    const user = await userModel.createUser(req.body)
    const token = generateToken(user)

    res.status(201).json({ user, token })
  } catch (err) {
    next(err)
  }
}

export async function login(req, res, next) {
  try {
    const user = await userModel.findByEmail(req.body.email)
    if (!user) {
      return res.status(401).json({ error: 'Invalid credentials' })
    }

    const valid = await userModel.verifyPassword(
      req.body.password,
      user.password
    )
    if (!valid) {
      return res.status(401).json({ error: 'Invalid credentials' })
    }

    const token = generateToken(user)
    const { password, ...safeUser } = user
    res.json({ user: safeUser, token })
  } catch (err) {
    next(err)
  }
}

export async function getProfile(req, res, next) {
  try {
    const user = await userModel.findById(req.user.id)
    if (!user) {
      return res.status(404).json({ error: 'User not found' })
    }
    res.json({ user })
  } catch (err) {
    next(err)
  }
}
```
```javascript
// src/controllers/projectController.js
import * as projectModel from '../models/projectModel.js'

export async function list(req, res, next) {
  try {
    const { page, limit, status } = req.query
    const result = await projectModel.findByUserId(req.user.id, {
      page: parseInt(page, 10) || 1,
      limit: Math.min(parseInt(limit, 10) || 20, 100), // cap page size at 100
      status,
    })
    res.json(result)
  } catch (err) {
    next(err)
  }
}

export async function create(req, res, next) {
  try {
    const project = await projectModel.create(req.user.id, req.body)
    res.status(201).json({ project })
  } catch (err) {
    next(err)
  }
}

export async function update(req, res, next) {
  try {
    const project = await projectModel.update(
      req.params.id,
      req.user.id,
      req.body
    )
    if (!project) {
      return res.status(404).json({ error: 'Project not found' })
    }
    res.json({ project })
  } catch (err) {
    next(err)
  }
}

export async function remove(req, res, next) {
  try {
    const deleted = await projectModel.remove(req.params.id, req.user.id)
    if (!deleted) {
      return res.status(404).json({ error: 'Project not found' })
    }
    res.status(204).end()
  } catch (err) {
    next(err)
  }
}
```

9) Route definitions

Routes wire HTTP methods and URL patterns to controller functions. Middleware like authentication and validation are applied at the route level, making it immediately clear which endpoints are protected and what input they expect.

```javascript
// src/routes/index.js
import { Router } from 'express'
import { authenticate } from '../middleware/auth.js'
import { validate } from '../middleware/validate.js'
import { registerSchema, loginSchema, projectSchema } from '../validators/userSchemas.js'
import * as userCtrl from '../controllers/userController.js'
import * as projectCtrl from '../controllers/projectController.js'

const router = Router()

// Public auth routes
router.post('/register', validate(registerSchema), userCtrl.register)
router.post('/login', validate(loginSchema), userCtrl.login)

// Protected user routes
router.get('/me', authenticate, userCtrl.getProfile)

// Protected project routes
router.get('/projects', authenticate, projectCtrl.list)
router.post('/projects', authenticate, validate(projectSchema), projectCtrl.create)
router.put('/projects/:id', authenticate, validate(projectSchema), projectCtrl.update)
router.delete('/projects/:id', authenticate, projectCtrl.remove)

export default router
```

10) Global error handling

A centralized error handler catches every unhandled error in the request pipeline. It logs the full error for debugging while returning a safe, consistent response to the client. In production, never expose stack traces or internal error details to API consumers.

```javascript
// src/middleware/errorHandler.js
export function errorHandler(err, req, res, _next) {
  console.error(`[${new Date().toISOString()}] ${req.method} ${req.path}`, {
    error: err.message,
    stack: err.stack,
    // Logged raw here for brevity; in production, redact sensitive
    // fields such as passwords and tokens before logging bodies.
    body: req.body,
    user: req.user?.id,
  })

  // PostgreSQL unique violation
  if (err.code === '23505') {
    return res.status(409).json({
      error: 'Resource already exists',
    })
  }

  // PostgreSQL foreign key violation
  if (err.code === '23503') {
    return res.status(400).json({
      error: 'Referenced resource not found',
    })
  }

  const status = err.statusCode || 500
  res.status(status).json({
    error: status === 500 ? 'Internal server error' : err.message,
  })
}
```

Catching PostgreSQL-specific error codes like 23505 (unique violation) and 23503 (foreign key violation) lets you return meaningful HTTP status codes instead of generic 500 errors. Your API consumers will thank you when they can distinguish between a duplicate email and a server crash.

11) Server entry point and graceful shutdown

The server entry point starts the HTTP listener and handles graceful shutdown. When the process receives a termination signal, it stops accepting new connections, waits for in-flight requests to complete, closes the database pool, and exits cleanly. This prevents dropped requests during deployments.

```javascript
// server.js
import 'dotenv/config'
import app from './src/app.js'
import pool from './src/config/database.js'

const PORT = process.env.PORT || 3000

const server = app.listen(PORT, () => {
  console.log(`API server running on port ${PORT}`)
})

async function shutdown(signal) {
  console.log(`${signal} received. Shutting down gracefully...`)

  server.close(async () => {
    console.log('HTTP server closed')
    await pool.end()
    console.log('Database pool closed')
    process.exit(0)
  })

  // Force exit after 10 seconds. unref() keeps this timer from
  // holding the process open once shutdown completes cleanly.
  setTimeout(() => {
    console.error('Forced shutdown after timeout')
    process.exit(1)
  }, 10000).unref()
}

process.on('SIGTERM', () => shutdown('SIGTERM'))
process.on('SIGINT', () => shutdown('SIGINT'))
```

12) Testing your API with curl

Before writing automated tests, verify each endpoint manually. Curl commands are fast, repeatable, and easy to share with teammates. Here is the complete flow from registration through CRUD operations on projects.

```bash
# Register a new user
curl -X POST http://localhost:3000/api/register \
  -H "Content-Type: application/json" \
  -d '{"email":"dev@example.com","password":"securepass123","full_name":"Jane Doe"}'

# Login and capture the token (requires jq)
TOKEN=$(curl -s -X POST http://localhost:3000/api/login \
  -H "Content-Type: application/json" \
  -d '{"email":"dev@example.com","password":"securepass123"}' \
  | jq -r '.token')

# Get the current user profile
curl http://localhost:3000/api/me \
  -H "Authorization: Bearer $TOKEN"

# Create a project
curl -X POST http://localhost:3000/api/projects \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"name":"Client Portal","description":"Customer-facing dashboard","budget":45000}'

# List projects with pagination
curl "http://localhost:3000/api/projects?page=1&limit=10" \
  -H "Authorization: Bearer $TOKEN"

# Update a project (replace PROJECT_ID with an actual UUID).
# PUT replaces the resource, so required fields like name must be
# included or the projectSchema validation will reject the request.
curl -X PUT http://localhost:3000/api/projects/PROJECT_ID \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"name":"Client Portal","status":"completed"}'

# Delete a project
curl -X DELETE http://localhost:3000/api/projects/PROJECT_ID \
  -H "Authorization: Bearer $TOKEN"
```

13) Rate limiting and production hardening

Before deploying, add rate limiting to prevent abuse. A simple in-memory rate limiter works for single-server deployments. For multi-server setups, use Redis-backed rate limiting. Also add request ID tracking for correlating logs across services.

```javascript
// src/middleware/rateLimit.js
import { randomUUID } from 'crypto'

const windowMs = 15 * 60 * 1000 // 15 minutes
const maxRequests = 100
// In-memory store: fine for a single process, but entries are never
// evicted and counts are not shared between servers. Use a Redis-backed
// limiter for multi-server deployments.
const store = new Map()

export function rateLimit(req, res, next) {
  const key = req.ip
  const now = Date.now()
  const record = store.get(key) || { count: 0, resetAt: now + windowMs }

  if (now > record.resetAt) {
    record.count = 0
    record.resetAt = now + windowMs
  }

  record.count++
  store.set(key, record)

  res.setHeader('X-RateLimit-Limit', maxRequests)
  res.setHeader('X-RateLimit-Remaining', Math.max(0, maxRequests - record.count))

  if (record.count > maxRequests) {
    return res.status(429).json({ error: 'Too many requests' })
  }

  next()
}

export function requestId(req, res, next) {
  req.id = req.headers['x-request-id'] || randomUUID()
  res.setHeader('X-Request-Id', req.id)
  next()
}
```

Architecture decisions that matter

  • Parameterized queries everywhere — never concatenate user input into SQL strings. The pg library handles parameterization with $1, $2 placeholders.
  • Passwords are hashed with bcrypt at 12 salt rounds. Never store plaintext passwords, and never use fast hashing algorithms like MD5 or SHA for passwords.
  • JWT tokens carry minimal claims (id, email, role). Do not put sensitive data in tokens because they are base64-encoded, not encrypted.
  • Database constraints (UNIQUE, CHECK, FOREIGN KEY) enforce data integrity even if application code has bugs.
  • The error handler catches PostgreSQL error codes and returns appropriate HTTP status codes instead of generic 500s.
  • Connection pooling with pg.Pool prevents connection exhaustion under concurrent load.
  • Graceful shutdown ensures zero dropped requests during deployments.
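The point about JWT claims is worth proving to yourself: a token's payload segment is just base64url-encoded JSON, readable by anyone who holds the token. A small sketch with made-up claims:

```javascript
// Encoding is not encryption: anyone can read a JWT payload without the secret.
const claims = { id: '42', email: 'dev@example.com', role: 'user' }

// Roughly what jwt.sign does to the payload segment (signature aside):
const encoded = Buffer.from(JSON.stringify(claims)).toString('base64url')

// What any token holder can do to read it back:
const decoded = JSON.parse(Buffer.from(encoded, 'base64url').toString('utf8'))
console.log(decoded.email) // dev@example.com
```

The signature only proves the token was not tampered with; it hides nothing.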

Common mistakes to avoid

After building dozens of production APIs for clients, these are the mistakes we see most often. Each one seems minor in development but causes real problems at scale.

  • Returning password hashes in API responses — always destructure or exclude password fields before sending user data.
  • Missing pagination — returning all rows works with 10 records but crashes with 10,000.
  • No input size limits — without body parser limits, a single request can exhaust server memory.
  • Catching errors silently — swallowing errors without logging makes debugging in production nearly impossible.
  • Hardcoding configuration — database credentials, JWT secrets, and API keys belong in environment variables, never in source code.
  • Skipping database indexes — every column used in WHERE clauses or JOIN conditions needs an index. The performance difference is orders of magnitude.
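The first mistake on this list is also the cheapest to prevent. A minimal sketch of stripping the hash with rest destructuring, using made-up sample values:

```javascript
// Remove the password hash before a user row leaves the API.
const userRow = {
  id: 'a1b2c3',                               // hypothetical values
  email: 'dev@example.com',
  password: '$2b$12$exampleexampleexampleex', // bcrypt hash, never expose
  full_name: 'Jane Doe',
  role: 'user',
}

const { password, ...safeUser } = userRow
console.log('password' in safeUser) // false
console.log(Object.keys(safeUser))  // ['id', 'email', 'full_name', 'role']
```

Better still, have your SELECT queries return only the columns you intend to expose, as the findById model function does.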

Next steps

This guide covers the foundation. From here, consider adding automated tests with a test database, request logging with structured JSON output, API documentation with OpenAPI/Swagger, file upload handling with multer, WebSocket support for real-time features, and containerization with Docker for consistent deployments.

The architecture pattern in this guide is the same one we use at Dude Lemon for client API projects. It scales from a weekend prototype to a production system serving thousands of users. If you are building an API for your business and want experienced engineers to review your architecture or handle the build, reach out to our team for a free consultation.

A well-structured API is not over-engineered — it is an investment that pays for itself the first time you need to debug a production issue at 2 AM.
