Calculus works fine without infinity. Finitism is basically a philosophical position without practical consequences. Plenty of serious people have planted their flag there. I don't find it particularly surprising that someone who works with computers, especially at a low level, would be drawn to it.
That's not a sum in the traditional sense, so don't think of it that way.
Infinities are used quite often in mathematics for rather mundane things; calculus doesn't work without them. Infinity is also quite important to the foundations of many other areas, but this is often hidden unless you get into advanced work (and for this sentence, a typical undergraduate course in multivariable calculus, PDEs, or linear algebra doesn't count as "advanced").
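As one concrete illustration (a standard textbook example, not anything exotic): the basic definitions of calculus are limits, i.e. statements about an unbounded process. In LaTeX notation,

    f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

and even a mundane-looking identity like

    \sum_{n=1}^{\infty} \frac{1}{2^n} = 1

is really shorthand for the claim that the partial sums s_N = 1 - 2^{-N} get arbitrarily close to 1 as N grows without bound. You can phrase all of this in finitist-friendly "for every epsilon there is an N" language, but the infinite process is still doing the work.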