I wonder if I can prove that the square matrix with entries $a_{ij} = x_i x_j$, for a given vector $x$, is positive semidefinite.

I hope it is, because this matrix is related to this question, where I asked whether the mean squared error is a convex function in linear regression. I actually computed the Hessian matrix and obtained the matrix above (multiplied by 2). The problem is that I don't know how to prove that it is positive semidefinite, which would show that the mean squared error is convex. Please give a proof or a counterexample.
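The connection to the Hessian can be sanity-checked numerically. The sketch below (names like `f` and the sample values are illustrative, and it assumes the squared error for a single training sample $x$ with target $t$) verifies with finite differences that the Hessian of $f(w) = (x \cdot w - t)^2$ is exactly $2\,x x^T$, i.e. twice the matrix with entries $a_{ij} = x_i x_j$:

```python
import numpy as np

# Single-sample squared error; x and t are arbitrary example values.
x = np.array([1.0, -2.0, 0.5])
t = 3.0

def f(w):
    return (x @ w - t) ** 2

# Analytic Hessian: 2 * outer(x, x), the matrix from the question times 2.
analytic_hessian = 2 * np.outer(x, x)

# Central finite-difference Hessian at an arbitrary point w0.
w0 = np.array([0.3, 1.1, -0.7])
h = 1e-5
n = len(x)
numeric_hessian = np.empty((n, n))
for i in range(n):
    for j in range(n):
        ei, ej = np.eye(n)[i] * h, np.eye(n)[j] * h
        numeric_hessian[i, j] = (
            f(w0 + ei + ej) - f(w0 + ei - ej)
            - f(w0 - ei + ej) + f(w0 - ei - ej)
        ) / (4 * h ** 2)

print(np.allclose(analytic_hessian, numeric_hessian, atol=1e-4))  # True
```

Since $f$ is quadratic in $w$, the central second difference agrees with the analytic Hessian up to floating-point rounding, regardless of the point $w_0$ chosen.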

=================

What do you think the sign of $\sum\limits_{i,j} a_{ij} u_i u_j$ could be, for every $(u_i)$?

– Did

Oct 20 at 17:22

=================

2 Answers


=================

You can easily check that $A = xx^T$. Then, for any $y$,

$$y^TAy = y^Txx^Ty = (x^Ty)^Tx^Ty \geq 0.$$

Or prove that the eigenvalues are $\|x\|^2$ (Euclidean norm) and $0$. 😉

– egreg

Oct 20 at 17:37

Yes, after first proving that $A$ is symmetric 🙂

– Martin Argerami

Oct 20 at 17:40
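The argument in this answer is easy to check numerically. The sketch below (the vector `x` is just an arbitrary example) forms $A = xx^T$, confirms that $y^TAy \geq 0$ for many random $y$, and confirms that the eigenvalues are $\|x\|^2$ and $0$:

```python
import numpy as np

x = np.array([2.0, -1.0, 3.0])
A = np.outer(x, x)  # A has entries a_ij = x_i * x_j

# y^T A y = (x^T y)^2, so it should never be (meaningfully) negative.
rng = np.random.default_rng(0)
quadratic_forms = [y @ A @ y for y in rng.standard_normal((1000, 3))]
print(min(quadratic_forms) >= -1e-12)  # True

# Eigenvalues of the rank-1 matrix A: ||x||^2 once, 0 repeated.
eigenvalues = np.linalg.eigvalsh(A)  # sorted ascending
print(np.allclose(eigenvalues, [0.0, 0.0, np.dot(x, x)]))  # True
```

`eigvalsh` is used because $A$ is symmetric; it returns the eigenvalues in ascending order, so the two zeros come first and $\|x\|^2 = x \cdot x$ last.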

If $x = (x_1, x_2, \dots, x_n)$ is the row vector, then $A = x^Tx$. So given a column vector $y$, you have that $y^TAy = (y^Tx^T)(xy) = (xy)^T(xy)$. Why is that never negative?