
# (Help Urgently Needed) Linear Algebra Problem


I used cofactor expansion along the first column to see that f is a polynomial of degree n - 1. The next step is to show that 1, 2, 3, ..., n - 1 are roots of f, but I'm not sure how to do this.

I've attached an image containing the problem statement.

Apr 8, 2020

#1

You're right that expanding down the first column shows f is a polynomial of degree n - 1.

If, in a determinant, the elements of one row are added to the corresponding elements of another row, the value of the determinant is unaltered. The same goes for columns: adding the elements of any column to the corresponding elements of any other column leaves the value unchanged. The same is true for subtraction, since subtracting any row from any other row, or any column from any other column, also leaves the value of the determinant unaltered. (Any of these operations can be carried out multiple times.)
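If you want to sanity-check that rule numerically, here's a quick sketch with an arbitrary random matrix (nothing specific to your problem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(4, 4)).astype(float)

# Subtract column 0 from column 1; the determinant should not change.
B = A.copy()
B[:, 1] -= B[:, 0]

print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # True
```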

So, in your example, put x = 1: all of the elements in the first column then equal 1. Subtract the first column from the second column (which has all of its elements equal to 1), and the second column consists entirely of zeros. Expanding along that column, the value of the determinant is zero, which implies that (x - 1) is a factor of the polynomial.

Next, put x = 2: the first column is now identical to the third column, so subtracting one from the other again produces a complete column of zeros. That implies that (x - 2) is a factor of the polynomial.

Repeat for x = 3, 4, ... , n - 1.
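The image with the problem statement isn't reproduced here, but a determinant matching your description would be a Vandermonde-style one whose first column is (1, x, x², ..., x^(n-1)) and whose (j+1)-th column is (1, j, j², ..., j^(n-1)). Assuming that hypothetical form, a quick numerical check confirms that x = 1, 2, ..., n - 1 are roots:

```python
import numpy as np

n = 5  # illustrative size; the roots are then 1, 2, 3, 4

def f(x):
    # Hypothetical matrix matching the verbal description: entry (i, j)
    # is (column node)^i, with nodes x, 1, 2, ..., n-1 across the columns.
    nodes = [x] + list(range(1, n))
    M = np.array([[c**i for c in nodes] for i in range(n)], dtype=float)
    return np.linalg.det(M)

# At x = k the first column duplicates column k+1, so f(k) = 0.
print(all(abs(f(k)) < 1e-8 for k in range(1, n)))  # True
```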

The coefficient of the leading term is unlikely to be 1, so the product of the factors will be multiplied by some constant C. Since f has degree n - 1 and you have found n - 1 linear factors, that gives f(x) = C(x - 1)(x - 2)...(x - (n - 1)).
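You can also check numerically that the constant really is the same at every point. Still assuming the hypothetical Vandermonde-style matrix described above (the actual one is in the unshown image), the ratio of f(x) to the product of the linear factors should not depend on x:

```python
import numpy as np
from math import prod

n = 5  # illustrative size

def f(x):
    # Hypothetical Vandermonde-style determinant matching the description:
    # column nodes are x, 1, 2, ..., n-1 and row i holds the i-th powers.
    nodes = [x] + list(range(1, n))
    M = np.array([[c**i for c in nodes] for i in range(n)], dtype=float)
    return np.linalg.det(M)

# f(x) / ((x-1)(x-2)...(x-(n-1))) should give the same constant C
# at every non-root point.
ratios = [f(x) / prod(x - k for k in range(1, n)) for x in (0.0, 6.0, 10.0)]
print(np.allclose(ratios, ratios[0]))  # True
```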

Apr 8, 2020