### Putnam 1992

**Problem B5**

Let A_{n} denote the (n-1) x (n-1) matrix (a_{ij}) with a_{ij} = i + 2 for i = j, and a_{ij} = 1 otherwise. Is the sequence (det A_{n})/n! bounded?

**Solution**

Answer: no.

Subtract column 1 from each of the other columns. This reduces each diagonal element after the first by 1, so the diagonal becomes 3, 3, 4, ... , n. The first row becomes -2 apart from its first element, the first column remains 1 apart from its first element, and every other element becomes 0.

Now subtract 1/3 of column 2 from column 1, 1/4 of column 3 from column 1, 1/5 of column 4 from column 1, and so on. Each subtraction cancels the single remaining 1 in column 1 below the top entry, and adds 2/k to the top entry (since row 1 of the column with diagonal entry k holds -2). This zeros every element of column 1 except the first, which becomes 3 + 2(1/3 + 1/4 + ... + 1/n).

We may now expand by the first column: the remaining minor is the diagonal matrix with entries 3, 4, ... , n, so det A_{n} = (3 + 2(1/3 + 1/4 + ... + 1/n)) · 3·4· ... ·n = (3 + 2(1/3 + 1/4 + ... + 1/n)) · n!/2. Hence (det A_{n})/n! = (3 + 2(1/3 + 1/4 + ... + 1/n))/2, which diverges as n tends to infinity because the harmonic series diverges. So the sequence is unbounded.
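The closed form can be checked numerically. The sketch below (not part of the original solution; the names `A`, `det` are illustrative) builds A_{n} with exact rational arithmetic from Python's `fractions` module, computes its determinant by straightforward Gaussian elimination, and compares (det A_{n})/n! against (3 + 2(1/3 + 1/4 + ... + 1/n))/2.

```python
from fractions import Fraction
from math import factorial

def det(m):
    """Determinant by fraction-exact Gaussian elimination."""
    m = [row[:] for row in m]
    n = len(m)
    d = Fraction(1)
    for i in range(n):
        # Find a nonzero pivot in column i at or below row i.
        p = next((r for r in range(i, n) if m[r][i] != 0), None)
        if p is None:
            return Fraction(0)
        if p != i:
            m[i], m[p] = m[p], m[i]
            d = -d  # row swap flips the sign
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def A(n):
    """(n-1) x (n-1) matrix with diagonal 3, 4, ..., n+1 and 1 elsewhere."""
    return [[Fraction(i + 3) if i == j else Fraction(1)
             for j in range(n - 1)] for i in range(n - 1)]

for n in [2, 5, 10, 20]:
    ratio = det(A(n)) / factorial(n)
    closed = (3 + 2 * sum(Fraction(1, k) for k in range(3, n + 1))) / 2
    print(n, float(ratio), ratio == closed)
```

The ratio matches the closed form exactly for each n, and its growth tracks the harmonic sum, consistent with unboundedness.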


© John Scholes

jscholes@kalva.demon.co.uk

6 Jan 2001