Language models can serve as universal regressors, producing precise numerical predictions across diverse experimental data; by representing both inputs and targets as text, they can outperform traditional regression models.
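A minimal sketch of the text-based representation idea: a parameter configuration is serialized into a plain-text prompt, and the model's numeric answer is emitted as sign/digit/exponent tokens that are decoded back into a float. The exact token format here (`"+ 1 2 3 E-2"` style) is an illustrative assumption, not necessarily the serialization used by any particular system.

```python
def encode_query(params: dict) -> str:
    """Serialize a parameter configuration as a text prompt.

    Keys are sorted so the same configuration always yields the
    same string, regardless of dict insertion order.
    """
    return ",".join(f"{k}:{v}" for k, v in sorted(params.items()))


def decode_number(text: str) -> float:
    """Decode a sign/digit/exponent token sequence into a float.

    Example format (assumed for illustration):
        "+ 1 2 3 E-2"  ->  +123 * 10**(-2)  ->  1.23
    """
    sign, *digits, exp = text.split()
    mantissa = int("".join(digits))
    exponent = int(exp[1:])  # strip the leading "E"
    return (1.0 if sign == "+" else -1.0) * mantissa * 10.0 ** exponent


# Usage: a hyperparameter configuration becomes a flat text query,
# and a token-level numeric answer is decoded back to a float.
prompt = encode_query({"lr": 0.1, "layers": 3})
value = decode_number("+ 1 2 3 E-2")
```

Because both sides of the regression problem are plain text, the same model can be trained on heterogeneous experiments without hand-designed feature vectors; only the serialization convention has to stay fixed.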