Creating a transformation matrix with respect to given bases? [closed]

Let’s say I have a linear transformation $T:V\to W$, along with bases $\{v_1,v_2\}$ and $\{w_1,w_2,w_3\}$ of each, respectively.

Let’s say all the information I have about the transformation and vector spaces is this, plus the images of $v_1$ and $v_2$ under $T$, and I want to model this in Mathematica and get a transformation matrix with respect to these bases. What is the best way to do this?

So in particular, I don’t yet know how $T(v_1)$ and $T(v_2)$ are written in terms of the basis vectors of $W$. Is there a simple way to get this with Mathematica?

For example, given
$$\{w_1=u_1+u_2+u_3,\qquad w_2=u_1-3u_2,\qquad w_3=4u_1+3u_2-u_3\},$$

$T(v_1) = u_1$, and $T(v_2) = u_2$, how do we make Mathematica write an arbitrary combination of $v_1$ and $v_2$ in terms of the $w_i$’s?

1 Answer

Assuming that the sets of vectors $\{v_1,v_2\}$, $\{w_1,w_2,w_3\}$, and $\{u_1,u_2,u_3\}$ are bases for their respective spaces, finding the transformation matrix is straightforward.

Given the transformation

v[1] -> u[1]
v[2] -> u[2]

where

{w[1] == u[1] + u[2] + u[3], w[2] == u[1] - 3 u[2], w[3] == 4 u[1] + 3 u[2] - u[3]}

we first construct the matrix of t with respect to the v and u bases via

(t[u, v] = SparseArray[{i_, i_} -> 1, {3, 2}] // Normal) // MatrixForm

Then, we construct the change-of-basis matrix that converts u-coordinates to w-coordinates. First, solve for the u[i] in terms of the w[j]:

sols = Array[u, 3] /. First@Solve[
{w[1] == u[1] + u[2] + u[3], w[2] == u[1] - 3 u[2], w[3] == 4 u[1] + 3 u[2] - u[3]},
Array[u, 3]
] // Expand
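As a quick cross-check of this solve step outside of Mathematica (an illustrative Python/SymPy sketch, not part of the original answer), the same linear system can be solved symbolically:

```python
from sympy import linsolve, symbols

u1, u2, u3, w1, w2, w3 = symbols("u1 u2 u3 w1 w2 w3")

# The same system as above; each entry is an expression equated to zero.
system = [u1 + u2 + u3 - w1,
          u1 - 3*u2 - w2,
          4*u1 + 3*u2 - u3 - w3]

# linsolve returns a set containing a single solution tuple for this system.
(sol,) = linsolve(system, [u1, u2, u3])

# Mathematically, sol == ((3*w1 + 4*w2 + 3*w3)/19,
#                         (w1 - 5*w2 + w3)/19,
#                         (15*w1 + w2 - 4*w3)/19)
print(sol)
```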

and transposing the result to get the change-of-basis matrix. Each row of Last@CoefficientArrays[sols, Array[w, 3]] expresses one u[i] in terms of the w[j], so it is the transpose that converts u-coordinates into w-coordinates:

(x[w, u] = Transpose[Last@CoefficientArrays[sols, Array[w, 3]] // Normal]) // MatrixForm

Finally, the matrix of t with respect to the v and w bases is

(t[w, v] = x[w, u].t[u, v]) // MatrixForm

Note, for instance, that this works correctly: $T(v_1)$ corresponds to

t[w, v].{1, 0}
(* {3/19, 4/19, 3/19} *)

and that output is exactly u[1] in the w basis, since u[1] == (3 w[1] + 4 w[2] + 3 w[3])/19.
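For a numerical sanity check (again an illustrative Python/NumPy sketch, with the basis vectors hard-coded from the example), the coordinates of u[1] in the w basis can be computed directly by placing the w[i] as columns of a matrix and solving:

```python
import numpy as np

# Columns are w1, w2, w3 written in the u basis:
# w1 = u1 + u2 + u3, w2 = u1 - 3 u2, w3 = 4 u1 + 3 u2 - u3.
W = np.array([[1.0,  1.0,  4.0],
              [1.0, -3.0,  3.0],
              [1.0,  0.0, -1.0]])

# T(v1) = u1, i.e. the coordinate vector (1, 0, 0) in the u basis.
u1_coords = np.array([1.0, 0.0, 0.0])

# Solve W c == u1 for c, the coordinates of u1 in the w basis.
c = np.linalg.solve(W, u1_coords)
print(19 * c)  # approximately [3. 4. 3.], i.e. c = {3/19, 4/19, 3/19}
```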

In general, given an arbitrary vector in the space $V$, we do the following:

vec = a[1] v[1] + a[2] v[2];
col = Normal@Last@CoefficientArrays[vec, Array[v, 2]]
(* {a[1], a[2]} *)

Then,

row = t[w, v].col
(* {(3 a[1])/19 + a[2]/19, (4 a[1])/19 - (5 a[2])/19, (3 a[1])/19 + a[2]/19} *)

and finally,

Apply[Plus, Array[w, 3] row]

which writes the image of vec as an explicit linear combination of the w[i].
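The full 3×2 matrix t[w, v] can be recovered the same way (a NumPy sketch under the same hard-coded bases, for illustration only): its columns are the w-coordinates of u[1] and u[2].

```python
import numpy as np

# w1, w2, w3 in the u basis, as columns.
W = np.array([[1.0,  1.0,  4.0],
              [1.0, -3.0,  3.0],
              [1.0,  0.0, -1.0]])

# T(v1) = u1 and T(v2) = u2, so in u-coordinates the matrix of T is the
# first two columns of the 3x3 identity.
t_uv = np.eye(3)[:, :2]

# Convert each column from u-coordinates to w-coordinates.
t_wv = np.linalg.solve(W, t_uv)
print(19 * t_wv)
# approximately [[ 3.  1.]
#                [ 4. -5.]
#                [ 3.  1.]]
```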

@mathers101. I’ve done things this way because I’m not sure exactly what format you want things in. Do you just want the matrix of TT? Do you want something like that last result? In which case we can short-circuit the matrix formulation.
– march
Mar 28 at 4:27

This is really great and helpful as it is. Thanks.
– Alex Mathers
Mar 28 at 17:49