sRGB

sRGB is one of those things everyone sees but rarely thinks about. Most people assume color is just “color,” but digital displays don’t naturally agree on how to show it. Back in the early 1990s, every monitor rendered brightness differently, printers didn’t match screens, and the early web had no unified way to display a simple blue or red. So in 1996, HP and Microsoft created sRGB as a universal color standard that any device could follow.

The cool twist is that sRGB’s brightness curve wasn’t invented from scratch—it was modeled after the behavior of old CRT monitors, whose response is roughly a gamma-2.2 power curve. Those bulky displays happened to output light in a way that matched human perception surprisingly well. Our eyes aren’t linear: we’re far more sensitive to changes in dark tones than in bright ones. If we stored linear, “real” light values in the usual 8 bits per channel, most of the codes would be spent on bright values we can barely tell apart, while the shadows would get so few that they’d band and crush. So sRGB uses a gamma curve to spend its precision where vision needs it, making brightness feel natural and comfortable to the eye.
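The curve itself is standardized (IEC 61966-2-1) as a piecewise function: a short linear segment near black, then a power curve with exponent 2.4, which together approximate gamma 2.2. Here is a minimal Python sketch of both directions of the conversion (single channel values in the 0–1 range):

```python
def srgb_to_linear(s: float) -> float:
    """Decode an sRGB-encoded channel value (0..1) to linear light."""
    if s <= 0.04045:
        return s / 12.92                      # linear toe near black
    return ((s + 0.055) / 1.055) ** 2.4       # power segment

def linear_to_srgb(l: float) -> float:
    """Encode a linear-light channel value (0..1) for display."""
    if l <= 0.0031308:
        return 12.92 * l
    return 1.055 * l ** (1 / 2.4) - 0.055

# Mid-gray in linear light encodes to roughly 0.735 in sRGB:
# the curve "lifts" dark values so shadow detail survives 8-bit storage.
print(round(linear_to_srgb(0.5), 3))  # prints 0.735
```

Note how 50% physical light maps to about 73.5% of the encoded range, leaving the lower half of the code values for the dark tones our eyes scrutinize most.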

Today, sRGB quietly runs almost the entire digital world. Most images, UI elements, game textures, and websites are created in sRGB. And while modern rendering engines perform lighting calculations in linear space for physical accuracy, they still convert the final result back to sRGB so it looks right to us. In short: linear space is for math, sRGB is for eyes—and this 1996 standard remains the backbone of how digital color is displayed everywhere.
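To see why engines do the lighting math in linear space, consider blending equal parts black and white, as happens when downscaling a checkerboard. Averaging the encoded sRGB values directly gives a gray that is too dark; decoding to linear light first, averaging, and re-encoding gives the physically correct result. A small illustrative sketch, with the standard transfer functions written inline so it stands alone:

```python
def srgb_to_linear(s: float) -> float:
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l: float) -> float:
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0  # sRGB channel values

# Naive: average the encoded values directly (wrong, but common).
naive = (black + white) / 2

# Correct: decode to linear light, average, re-encode for display.
correct = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(round(naive, 3), round(correct, 3))  # prints 0.5 0.735
```

The gap between 0.5 and 0.735 is exactly the kind of error that made early image resizing and game lighting look subtly wrong: linear space is for math, sRGB is for eyes.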