I like Serena: Hey reinout-g!
Let's see...
Suppose we define s = b' inv(A) b, which is a scalar.
It's important that s is a scalar, because it means we can move it freely past any matrix or vector in a product.
Then the question is whether: inv(A + b b') = inv(A) - inv(A) b b' inv(A) / (1 + s).
If we multiply both sides on the right by (A + b b'), which is presumably invertible, we get:
I = (inv(A) - inv(A) b b' inv(A) / (1 + s)) (A + b b')
Working it out, using inv(A) A = I and b' inv(A) b = s, we get:
I = (inv(A) - inv(A) b b' inv(A) / (1 + s)) (A + b b')
  = I + inv(A) b b' - inv(A) b b' / (1 + s) - inv(A) b b' inv(A) b b' / (1 + s)
  = I + (inv(A) b b' (1 + s) - inv(A) b b' - inv(A) b s b') / (1 + s)
  = I + (inv(A) b b' + s inv(A) b b' - inv(A) b b' - s inv(A) b b') / (1 + s)
  = I
(Sorry if it looks a bit cumbersome.)
Each of these steps can also be applied in reverse... so I think we've got it!
Yay!
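If you want to double-check it numerically, here's a quick sketch in Python with NumPy. The particular A and b are just made-up test data, and note the formula needs 1 + s to be nonzero:

import numpy as np

# Made-up test data: A is shifted toward a multiple of the identity,
# so it is (almost surely) invertible.
rng = np.random.default_rng(42)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal((n, 1))

A_inv = np.linalg.inv(A)
s = (b.T @ A_inv @ b).item()  # s = b' inv(A) b, a scalar

# The claimed inverse versus inverting A + b b' directly.
formula = A_inv - (A_inv @ b @ b.T @ A_inv) / (1 + s)
direct = np.linalg.inv(A + b @ b.T)

print(np.allclose(formula, direct))  # should print True

(This identity is known as the Sherman-Morrison formula, by the way.)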
reinout-g: Wow Serena!
I can't imagine how long it must have taken you to get that into 'inv(A)' notation!
Let alone the proof itself.
I finally deciphered it and do understand what you did there.
The idea of writing s for b' inv(A) b really helped me out.
Also, I appreciated how you chose to multiply by (A + b b'), since it makes it way easier to see where you're going in the proof.
Thanks a lot!