CS Essentials Deep Dive · 1 of 4

Stack & Heap — Where Your Data Lives

Every running program splits its memory into two big regions. The stack is fast, ordered, and tied to the call you're inside right now. The heap is for things that need to outlive the function that made them. Most "weird" bugs — leaks, dangling pointers, surprise nulls — are really arguments about which region a value belongs in.

Quick Facts

At a Glance

Basic Concepts

  • Stack: a per-thread region that grows and shrinks with function calls. Allocations are a single pointer bump — effectively free.
  • Heap: a process-wide region for objects whose size or lifetime isn't known at compile time. Allocations involve a manager (malloc, GC, allocator).
  • Stack frame: the block pushed on every function call — parameters, locals, return address. Popped on return.
  • Reference vs value: a stack variable can hold the value directly, or a reference (pointer) to something on the heap.
  • Lifetime: stack values die when their function returns. Heap values live until something — GC, free, or the end of the program — releases them.
The Picture

What's Actually In RAM

Top
Stack

Grows downward. One per thread. Small (typically 1–8 MB). Unbounded recursion blows it up — that's a stack overflow.

Middle
Heap

Grows upward. Shared across threads. Effectively unlimited (until the OS says no). Where new, malloc, and most objects live.

Static
BSS / Data / Code

Globals, string literals, the program's instructions. Loaded once, never resized.

OS
Mapped Regions

Memory-mapped files, shared libraries, kernel pages. Visible to your process but managed by the OS.

Languages with managed runtimes (JVM, .NET, Go, JS) add their own substructure on top — generations, eden space, large-object heap — but the stack/heap split is the foundation underneath all of it.

Stack

What Lives on the Stack

Function Frames

Every call pushes a frame: arguments, local variables, the return address, and saved registers. Return pops it. This is why deep recursion crashes — there's a hard ceiling, and unlike the heap it doesn't grow on demand.

Value Types & Small Locals

Integers, floats, booleans, fixed-size structs. In C, C++, Rust, and C#, value-type locals live on the stack by default; Go's compiler uses escape analysis to decide. In Java/Python/JS, primitive locals sit on the stack but objects go on the heap (with escape-analysis exceptions on the JVM).

Why It's Fast

Allocation is "subtract from the stack pointer." Deallocation is "add back." No bookkeeping, no fragmentation, no GC. Cache-friendly: the top of the stack is almost certainly hot in L1.

Heap

What Lives on the Heap

Things With Unknown or Long Lifetime

A value returned from a function, an object held by a long-lived service, a cache entry, a socket buffer. Anything whose lifetime crosses a function boundary has to live somewhere stable — that's the heap.

Things With Unknown Size

A list whose length is read from a file, a string built at runtime, a tree of arbitrary depth. The compiler can't reserve a fixed slot, so it asks the allocator for a block.

How Allocators Work
  • Free lists / size classes: jemalloc, tcmalloc, mimalloc keep buckets of pre-sized chunks to avoid fragmentation.
  • Bump allocators / arenas: allocate forward, free everything at once. Common in compilers and per-request lifetimes.
  • GC heaps: generational (young / old), often compacting — moves live objects together to keep allocation a pointer bump.
Languages

How the Big Runtimes Handle It

| Language | Memory Model | Who Frees |
|---|---|---|
| C / C++ | Manual. Stack for locals, new/malloc for heap. RAII (C++) ties heap lifetime to stack scope via destructors. | You. |
| Rust | Ownership + borrow checker. Heap via Box, Vec, Rc. Compiler proves lifetimes; no GC. | Compiler-inserted drops. |
| Java / C# / Kotlin | Almost everything on the heap. Generational GC; escape analysis can stack-allocate hot objects. | Garbage collector. |
| Go | Stack by default; escape analysis promotes to heap. Concurrent low-latency GC. | Garbage collector. |
| Python / JS / Ruby | Objects on the heap. Reference counting plus a cycle collector (CPython) or tracing GC (V8, Ruby's mark-sweep). | Runtime. |
| Swift / Obj-C | Automatic Reference Counting (ARC) — compiler inserts retain/release. | ARC + you (cycles). |
Bugs

Where This Bites You

  • Stack overflow: unbounded recursion or huge local arrays. Fix the recursion (iterative / trampoline) — don't just raise the stack size.
  • Dangling pointer / use-after-free: returning a reference to a stack local, or freeing heap memory still in use. The classic source of CVEs in C/C++.
  • Memory leaks: heap allocations the program never releases. In GC'd languages, usually a long-lived collection holding short-lived objects (a cache that never evicts, an event listener never unsubscribed).
  • Reference cycles: A holds B, B holds A. Reference counting can't free them — needs a cycle collector or weak references.
  • Fragmentation: heap full of holes too small to use. Long-running C/C++ services watch this; modern allocators mitigate but don't eliminate it.
  • False sharing & cache misses: heap objects touched by different threads on the same cache line tank performance. Pad or align hot structures.
Performance

Practical Rules of Thumb

Allocate Less

The cheapest heap allocation is the one you skip. Pool buffers, reuse slices, prefer arrays of structs over arrays of pointers. Profile allocations the way you profile CPU — they're often the hidden cost in "GC pause" stories.

Keep Hot Data Contiguous

An array of 1M ints walks the cache linearly. A linked list of 1M ints chases a pointer per element and stalls the CPU. Same Big-O; 10–100× difference in real-world performance.

Know Your GC

JVM, Go, .NET, V8 all have tunables and tradeoffs (throughput vs latency, generational vs region-based). For latency-sensitive services, learn what your GC does on a major collection — then either reduce allocation, switch collector, or accept the pause.
