A little further elaboration on what you're looking for would be appreciated; as you've asked it, this could be anything from an elementary probability exercise to what I think might be an open problem!
So if we fix k and take n large, then the answer depends entirely on the size of M compared with n. The term of art is "threshold function"; see here for more details. Specifically, for fixed k, if $M \gg n^{2-2/(k-1)}$ then almost every graph with n vertices and M edges has a clique of size k; if $M \ll n^{2-2/(k-1)}$ then almost no graph with those parameters does. This can be proved by a straightforward application of linearity of expectation together with a variance bound (Chebyshev's inequality), using the fact that every proper subgraph of $K_k$ has a smaller ratio of edges to vertices than $K_k$ itself. (See Ch. 4 of Alon and Spencer for more details.) When $M$ grows at the same rate as $n^{2-2/(k-1)}$, the analysis is considerably subtler.
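To see where that exponent comes from, here is the first-moment heuristic, just a sketch with constants suppressed, using the usual $G(n,p)$ approximation $p \approx M/\binom{n}{2}$: the expected number of copies of $K_k$ is

$$\mathbb{E}[\# K_k] = \binom{n}{k}\, p^{\binom{k}{2}} \asymp n^k p^{k(k-1)/2},$$

which is of constant order exactly when $p \asymp n^{-2/(k-1)}$, i.e. when $M \asymp n^2 p \asymp n^{2-2/(k-1)}$. Below the threshold the expectation tends to 0 and Markov's inequality kills off the cliques; above it, you need the variance bound to show the count is actually concentrated.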
If we fix $M/n^2$ to be some constant (say around 1/4), then the size of the largest clique grows like $\log n$; in fact, as n goes to infinity (taking $M/\binom{n}{2} = 1/2$, i.e. edge density 1/2), the clique number becomes concentrated on two consecutive values! This should also not be too difficult, but it's in a chapter of Alon-Spencer (Ch. 10) I haven't actually read yet, so I'm just glancing at the proof; it doesn't look too bad. If you want more details I can try to read it more closely.
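The $\log n$ growth can at least be read off the first moment, as a back-of-the-envelope sketch: with edge density $p = 1/2$, the expected number of $k$-cliques is

$$\mathbb{E}[\# K_k] = \binom{n}{k}\, 2^{-\binom{k}{2}},$$

and, ignoring lower-order terms, this switches from enormous to tiny as $k$ passes roughly $2\log_2 n$, since $\binom{n}{k}$ contributes about $k\log n$ to the exponent while $2^{-\binom{k}{2}}$ contributes about $-\tfrac{k^2}{2}\log 2$. The two-point concentration result sharpens this heuristic: for almost every graph the clique number lands on one of two consecutive integers near that value.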
In general, the first approach I'd try would be to estimate the expected number of k-cliques in a random graph and then bound the variance (the second-moment method again), but how well this works depends on what exactly you're trying to do.
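If a numerical sanity check helps, here's a minimal simulation sketch of that first-moment estimate, using only the Python standard library; the parameters (n, k, and the edge counts M) are placeholders I've picked purely for illustration, not anything from the problem.

```python
import itertools
import math
import random

def random_gnm(n, m):
    """Uniform random graph on n vertices with m edges, stored as a set of frozenset edges."""
    all_edges = list(itertools.combinations(range(n), 2))
    return {frozenset(e) for e in random.sample(all_edges, m)}

def count_k_cliques(n, edges, k):
    """Brute-force count of k-cliques; fine for small n and k."""
    return sum(
        1
        for verts in itertools.combinations(range(n), k)
        if all(frozenset(pair) in edges for pair in itertools.combinations(verts, 2))
    )

def expected_k_cliques(n, m, k):
    """First-moment estimate C(n,k) * p^C(k,2), with p = m / C(n,2) (the G(n,p) approximation)."""
    p = m / math.comb(n, 2)
    return math.comb(n, k) * p ** math.comb(k, 2)

n, k, trials = 40, 4, 10
for m in (150, 300, 600):
    avg = sum(count_k_cliques(n, random_gnm(n, m), k) for _ in range(trials)) / trials
    print(f"M={m}: expected ~ {expected_k_cliques(n, m, k):.1f}, simulated ~ {avg:.1f}")
```

The sizes here are tiny compared with the asymptotic regime, so expect the agreement to be rough; it's only meant to make the expectation formula concrete.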