I'm attempting to come up with a first-order language $L$ capable of describing vector spaces over fields. I came up with the following sets of nonlogical symbols.
For relation symbols, $Rs_L=\{Scal,Vec\}$, where $Scal$ and $Vec$ are unary relation symbols intended to hold of an element $x$ exactly when $x$ is a scalar or a vector, respectively.
For function symbols, $Fs_L=\{+,\cdot\}$, the usual binary symbols for addition and multiplication, shared by both vectors and scalars.
For constant symbols, $Cs_L=\{0,1,0_V\}$, where $0$ and $1$ are the additive and multiplicative identities of the field, respectively, and $0_V$ is the additive identity for vectors.
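In summary, the signature I have in mind is
$$L\;=\;\bigl(\underbrace{Scal,\,Vec}_{\text{unary relations}};\;\underbrace{+,\,\cdot}_{\text{binary functions}};\;\underbrace{0,\,1,\,0_V}_{\text{constants}}\bigr).$$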
Using these, I was able to formulate axioms for scalar multiplication on a given structure. To deal with ill-sorted terms, I define addition of a scalar and a vector to always give $0_V$, and multiplication of two vectors to always give $0_V$. This yields a theory whose models are exactly the vector spaces in the usual algebraic sense. It is easy to see that every linear transformation (in the algebraic sense) induces an $L$-homomorphism (in the logical sense) by extending it as the identity on scalars, but I am having difficulty proving that these are the only homomorphisms.
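For concreteness, a few of these axioms might be written as follows (a sketch; my full list also includes the field axioms for the scalars, the abelian-group axioms for the vectors, and the remaining scalar-multiplication axioms):
$$\forall x\,\bigl(Scal(x)\lor Vec(x)\bigr),\qquad \forall x\,\lnot\bigl(Scal(x)\land Vec(x)\bigr),$$
$$\forall x\,\forall y\,\Bigl(\bigl(Scal(x)\land Vec(y)\bigr)\to x+y=0_V\land y+x=0_V\Bigr),$$
$$\forall v\,\forall w\,\Bigl(\bigl(Vec(v)\land Vec(w)\bigr)\to v\cdot w=0_V\Bigr),$$
$$\forall a\,\forall v\,\forall w\,\Bigl(\bigl(Scal(a)\land Vec(v)\land Vec(w)\bigr)\to a\cdot(v+w)=a\cdot v+a\cdot w\Bigr).$$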
I suppose I want to show that for any $L$-homomorphism $T$, $T(av+w)=aT(v)+T(w)$ for $a$ a scalar and $v,w$ vectors. Since $T$ is a homomorphism, I see that $T(av+w)=T(av)+T(w)=T(a)T(v)+T(w)$, so it would be enough to show that every homomorphism acts as the identity on all scalars. Is there a way to show this, or an extra axiom I could include?
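To spell this computation out, using only that $T$ preserves $+$ and $\cdot$:
$$T(a\cdot v+w)\;=\;T(a\cdot v)+T(w)\;=\;T(a)\cdot T(v)+T(w).$$
Note that since homomorphisms preserve the relations $Scal$ and $Vec$, $T(a)$ is again a scalar and $T(v),T(w)$ are again vectors; so if additionally $T(a)=a$ for every scalar $a$, the right-hand side reduces to $a\cdot T(v)+T(w)$, i.e. $T$ is linear on the vectors.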