This is an attempt to extend the current full-fledged random matrix theory to fields of positive characteristic. Here is a possible setup for the problem: let $A_{n,p}$ be an $n \times n$ matrix with entries i.i.d. uniform in $F_p$. One can then find its eigenvalues together with their multiplicities, which may lie in some finite extension of $F_p$. To ensure diagonalizability, one might even take $A_{n,p}$ to be symmetric or antisymmetric (I am not sure that this guarantees diagonalizability over $F_p$, but I have no counterexamples either). Now the question: if we associate to each eigenvalue $\lambda$ the degree $d(\lambda)$ of its minimal polynomial over $F_p$, does the distribution of $d(\lambda)$ converge, after suitable normalization, to some law (perhaps Gaussian) as $n \to \infty$? I am very curious whether others have studied this problem before. Maybe it's completely trivial.
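To play with this numerically, here is a minimal simulation sketch (assuming SymPy; the function name `eigenvalue_degree_counts` and the sample parameters are mine). It samples matrices with i.i.d. uniform entries in $F_p$, factors the characteristic polynomial mod $p$, and tallies $d(\lambda)$ over the eigenvalues, using the fact that an eigenvalue's minimal-polynomial degree is the degree of the irreducible factor it is a root of:

```python
import random
from collections import Counter
from sympy import Matrix, Poly, symbols

x = symbols('x')

def eigenvalue_degree_counts(n, p, trials=100):
    """Tally d(lambda) over the eigenvalues (with multiplicity) of random
    n x n matrices with i.i.d. uniform entries in F_p.

    An irreducible factor of degree d appearing with multiplicity m in the
    characteristic polynomial accounts for d*m eigenvalues in the algebraic
    closure, each with minimal-polynomial degree d.
    """
    counts = Counter()
    for _ in range(trials):
        # random matrix with entries uniform in {0, ..., p-1}
        A = Matrix(n, n, lambda i, j: random.randrange(p))
        charpoly = A.charpoly(x).as_expr()        # characteristic polynomial over ZZ
        _, factors = Poly(charpoly, x, modulus=p).factor_list()  # factor mod p
        for f, mult in factors:
            d = f.degree()
            counts[d] += d * mult
    return counts

if __name__ == "__main__":
    # e.g. 30 x 30 matrices over F_5: empirical distribution of d(lambda)
    print(eigenvalue_degree_counts(n=30, p=5, trials=20))
```

This only probes the question empirically, of course; whether the resulting histogram of $d(\lambda)$ stabilizes under some normalization as $n$ grows is exactly the open question above.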