This is an attempt to extend current full-fledged random matrix theory to fields of positive characteristic. Here is a possible setup for the problem: let $A$ be an $n \times n$ matrix with entries iid, taking values uniformly in the finite field $\mathbb{F}_p$. One should then be able to find its eigenvalues together with their multiplicities, which might lie in some finite extension of $\mathbb{F}_p$. To ensure diagonalizability, one might even take $A$ to be symmetric or antisymmetric (I am not sure whether that guarantees diagonalizability over $\overline{\mathbb{F}_p}$, but I have no counterexamples either). Now the question: if we associate to each eigenvalue $\lambda$ the degree $d(\lambda)$ of its minimal polynomial over $\mathbb{F}_p$, does the distribution of $d(\lambda)$ converge, upon suitable normalization, to some law (say, maybe Gaussian) as $n \to \infty$? I am very curious whether others have studied this problem before. Maybe it's completely trivial.