FUNC1 — strengthenings, variants, potential counterexamples

January 29, 2016

After my tentative Polymath proposal, there definitely seems to be enough momentum to start a discussion “officially”, so let’s see where it goes. I’ve thought about the question of whether to call it Polymath11 (the first unclaimed number) or Polymath12 (regarding the polynomial-identities project as Polymath11). In the end I’ve gone for Polymath11, since the polynomial-identities project was listed on the Polymath blog as a proposal, and I think the right way of looking at things is that the problem got solved before the proposal became a fully-fledged project. But I still think that that project should be counted as a Polymathematical success story: it shows the potential benefits of opening up a problem for consideration by anybody who might be interested.

Something I like to think about with Polymath projects is the following question: if we end up not solving the problem, then what can we hope to achieve? The Erdős discrepancy problem project is a good example here. An obvious answer is that we can hope that enough people have been stimulated in enough ways that the probability of somebody solving the problem in the not too distant future increases (for example because we have identified more clearly the gap in our understanding). But I was thinking of something a little more concrete than that: I would like at the very least for this project to leave behind it an online resource that will be essential reading for anybody who wants to attack the problem in future. The blog comments themselves may achieve this to some extent, but it is not practical to wade through hundreds of comments in search of ideas that may or may not be useful. With past projects, we have developed Wiki pages where we have tried to organize the ideas we have had into a more browsable form. One thing we didn’t do with EDP, which in retrospect I think we should have, is have an official “closing” of the project marked by the writing of a formal article that included what we judged to be the main ideas we had had, with complete proofs when we had them. An advantage of doing that is that if somebody later solves the problem, it is more convenient to be able to refer to an article (or preprint) than to a combination of blog comments and Wiki pages.

Frankl’s union-closed conjecture — a possible Polymath project?

January 21, 2016

Although it was from only a couple of people, I had an enthusiastic response to a very tentative suggestion that it might be rewarding to see whether a polymath project could say anything useful about Frankl’s union-closed conjecture. A potentially worrying aspect of the idea is that the problem is extremely elementary to state, does not seem to yield to any standard techniques, and is rather notorious. But, as one of the commenters said, that is not necessarily an argument against trying it. A notable feature of the polymath experiment has been that it throws up surprises, so while I wouldn’t expect a polymath project to solve Frankl’s union-closed conjecture, I also know that I need to be rather cautious about my expectations — which in this case is an argument in favour of giving it a try.

A less serious problem is what acronym one would use for the project. For the density Hales-Jewett problem we went for DHJ, and for the Erdős discrepancy problem we used EDP. That general approach runs into difficulties with Frankl’s union-closed conjecture, so I suggest FUNC. This post, if the project were to go ahead, could be FUNC0; in general I like the idea that we would be engaged in a funky line of research.

Entropy and Sidorenko’s conjecture — after Szegedy

November 18, 2015

Here is a simple but important fact about bipartite graphs. Let G be a bipartite graph with (finite) vertex sets X and Y and edge density \alpha (meaning that the number of edges is \alpha |X||Y|). Now choose (x_1,x_2) uniformly at random from X^2 and (y_1,y_2) uniformly at random from Y^2. Then the probability that all of x_1y_1, x_1y_2, x_2y_1 and x_2y_2 are edges is at least \alpha^4.

The proof is very simple. For each x, let f_x:Y\to\{0,1\} be the characteristic function of the neighbourhood of x. That is, f_x(y)=1 if xy is an edge and 0 otherwise. Then \sum_{x,y}f_x(y) is the sum of the degrees of the x\in X, which is the number of edges of G, which is \alpha |X||Y|. If we set f=\sum_{x\in X}f_x, then this tells us that \sum_{y\in Y}f(y)=\alpha|X||Y|. By the Cauchy-Schwarz inequality, it follows that \sum_{y\in Y}f(y)^2\geq\alpha^2|X|^2|Y|.

But by the Cauchy-Schwarz inequality again,
(\sum_{y\in Y}f(y)^2)^2=(\sum_{y\in Y}\sum_{x_1,x_2\in X}f_{x_1}(y)f_{x_2}(y))^2
= (\sum_{x_1,x_2\in X}\sum_{y\in Y}f_{x_1}(y)f_{x_2}(y))^2
\leq |X|^2\sum_{x_1,x_2\in X}(\sum_{y\in Y}f_{x_1}(y)f_{x_2}(y))^2
=|X|^2\sum_{x_1,x_2\in X}\sum_{y_1,y_2\in Y}f_{x_1}(y_1)f_{x_2}(y_1)f_{x_1}(y_2)f_{x_2}(y_2).
That last expression is |X|^2 times the number of quadruples x_1,x_2,y_1,y_2 such that all of x_1y_1, x_1y_2, x_2y_1 and x_2y_2 are edges, and our previous estimate shows that it is at least \alpha^4|X|^4|Y|^2. Therefore, the probability that a random such quadruple consists entirely of edges is at least \alpha^4, as claimed (since there are |X|^2|Y|^2 possible quadruples to choose from).
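As a quick sanity check on the inequality (my own illustration, not part of the argument above), here is a short Python sketch that computes the edge density and the quadruple density by brute force on a small random bipartite graph and confirms that the latter is at least the fourth power of the former.

```python
import random

# Empirical check of the alpha^4 bound on a small random bipartite graph.
# X and Y are index sets {0, ..., n-1}; edge[x][y] is True if xy is an edge.
random.seed(0)
nx, ny = 8, 8
edge = [[random.random() < 0.5 for _ in range(ny)] for _ in range(nx)]

# Edge density alpha: number of edges divided by |X||Y|.
alpha = sum(edge[x][y] for x in range(nx) for y in range(ny)) / (nx * ny)

# Count quadruples (x1, x2, y1, y2) with all of x1y1, x1y2, x2y1, x2y2 edges.
quads = sum(
    edge[x1][y1] and edge[x1][y2] and edge[x2][y1] and edge[x2][y2]
    for x1 in range(nx) for x2 in range(nx)
    for y1 in range(ny) for y2 in range(ny)
)
prob = quads / (nx ** 2 * ny ** 2)
assert prob >= alpha ** 4 - 1e-12  # the inequality proved above
```

Since the inequality is a theorem, the assertion holds for every graph, not just this random one; replacing the seed or the density changes the numbers but not the conclusion.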

Interesting times in academic publishing

November 10, 2015

In this post I want briefly to mention four current goings-on in the world of academic publishing.

First, I’ll just briefly say that things are going well with the new journal Discrete Analysis, and I think we’re on course to launch, as planned, early next year with a few very good accepted papers — we certainly have a number of papers in the pipeline that look promising to me. Of course, we’d love to have more.

Secondly, a very interesting initiative has recently been started by Martin Eve, called the Open Library of Humanities. The rough idea is that they provide a platform for humanities journals that are free to read online and free for authors (or, as some people like to say, are Diamond OA journals). Perhaps the most interesting aspect of this initiative is that it is funded by a consortium of libraries. Librarians are the people who feel the pain of ridiculous subscription prices, so they have great goodwill towards people who are trying to build new and cheaper publication models. I think there is no reason that the sciences couldn’t do something similar — in fact, it should be even easier to find money.

Gil Kalai starts Polymath10

November 5, 2015

I’ve already mentioned this on Google Plus, but my readership there is different from my readership here, so it seems a good idea to write a similar post here, to give what publicity I can to the fact that Gil Kalai has started a new Polymath project. The problem he would like solved (and I would very much like to see it solved too) is the famous delta-systems or sunflower conjecture of Erdős and Rado.

The problem, as with many problems in combinatorics, is easy to state, but fascinatingly hard to solve. It is a classic extremal problem, in that it asks how big some combinatorial object needs to be before it is guaranteed to contain a subobject of some particular kind. In this case, the object is a k-uniform hypergraph, which just means a collection of sets of size k. The subobject one would like to find is a sunflower of size r, which means a collection of sets A_1,\dots,A_r such that we can find pairwise disjoint sets H, P_1,\dots,P_r with A_i=H\cup P_i for each i. I have used the letters H and P to stand for “head” and “petal” — H is the head of the sunflower and P_1,\dots,P_r are the petals.
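For readers who want to experiment, here is a small brute-force sketch in Python (my own illustration, not from the post). It uses the equivalent characterisation that a family of sets is a sunflower exactly when every pairwise intersection equals the common head H, i.e. the intersection of all the sets.

```python
from itertools import combinations

def is_sunflower(sets):
    # A sunflower: every pairwise intersection equals the common head H.
    # (The petals P_i = A_i \ H are then automatically pairwise disjoint.)
    head = set.intersection(*sets)
    return all(a & b == head for a, b in combinations(sets, 2))

def has_sunflower(family, r):
    # Brute-force search over all r-element subfamilies.
    return any(is_sunflower(list(c)) for c in combinations(family, r))

# The first three sets form a sunflower with head {1, 2}
# and petals {3}, {4}, {5}.
family = [{1, 2, 3}, {1, 2, 4}, {1, 2, 5}, {3, 4, 5}]
```

This is exponential in the size of the family, of course, so it is only useful for playing with very small examples, but that is often enough to test a conjectured construction.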

EDP28 — problem solved by Terence Tao!

September 20, 2015

I imagine most people reading this will already have heard that Terence Tao has solved the Erdős discrepancy problem. He has blogged about the solution in two posts, a first that shows how to reduce the problem to the Elliott conjecture in number theory, and a second that shows (i) that an averaged form of the conjecture is sufficient and (ii) that he can prove the averaged form. Two preprints covering (i) and (ii) are here and here: the one covering (i) has been submitted to Discrete Analysis.

This post is therefore the final post of the Polymath5 project. I refer you to Terry’s posts for the mathematics. I will just make a few comments about what all this says about polymath projects in general.

Discrete Analysis — an arXiv overlay journal

September 10, 2015

This post is to announce the start of a new mathematics journal, to be called Discrete Analysis. While in most respects it will be just like any other journal, it will be unusual in one important way: it will be purely an arXiv overlay journal. That is, rather than publishing, or even electronically hosting, papers, it will consist of a list of links to arXiv preprints. Other than that, the journal will be entirely conventional: authors will submit links to arXiv preprints, and then the editors of the journal will find referees, using their quick opinions and more detailed reports in the usual way in order to decide which papers will be accepted.

Part of the motivation for starting the journal is, of course, to challenge existing models of academic publishing and to contribute in a small way to creating an alternative and much cheaper system. However, I hope that in due course people will get used to this publication model, at which point the fact that Discrete Analysis is an arXiv overlay journal will no longer seem interesting or novel, and the main interest in the journal will be the mathematics it contains.

The members of the editorial board so far — but we may well add further people in the near future — are Ernie Croot, me, Ben Green, Gil Kalai, Nets Katz, Bryna Kra, Izabella Laba, Tom Sanders, Jozsef Solymosi, Terence Tao, Julia Wolf, and Tamar Ziegler. For the time being, I will be the managing editor. I interpret this as meaning that I will have the ultimate responsibility for the smooth running of the journal, and will have to do a bit more work than the other editors, but that decisions about journal policy and about accepting or rejecting papers will be made democratically by the whole editorial board. (For example, we had quite a lot of discussion, including a vote, about the title, and the other editors have approved this blog post after suggesting a couple of minor changes.)

I will write the rest of this post as a series of questions and answers.

Is Nick Clegg a Liberal Democrat?

April 28, 2015

All my life I have found that Liberal Democrat policies (and before that, Liberal-SDP Alliance policies, and before that, Liberal policies) have been, if not a perfect match to my own views, then at least the closest to them. In particular, I am broadly centrist, tilting somewhat to the left, strongly in favour of voting reform, strongly in favour of remaining part of the European Union, and very keen to take much more radical action to combat climate change. However, Nick Clegg is doing his best to persuade me that to vote Liberal Democrat is no longer a good way of promoting these values. Here is what he has said about building coalitions after the election:

As we have always said, the party with the most votes and the most seats in this election has the first right to seek to form a Government. The British people would rightly question the legitimacy of a coalition that didn’t allow the party with the largest number of seats and votes the opportunity to attempt to form a Government first.

I’m proud that the Liberal Democrats have proved we can form a strong and stable coalition government, able to bring prosperity to Britain.

Just like we would not put UKIP in charge of Europe, we are not going to put the SNP in charge of Britain – a country they want to rip apart.

The current projections at FiveThirtyEight put the Conservatives on 281 seats, Labour on 268, the Scottish Nationalists on 49 and the Liberal Democrats on 26. If these are correct, then Clegg is saying that he will try first to form a Government with the Conservatives. I claim that this is inconsistent with all four of the fundamental Liberal values I mentioned.

USS changes — don’t be fooled

March 12, 2015

This post is meant for anybody who will be affected by proposed changes to the Universities Superannuation Scheme, the body to which I and many other UK academics have paid our pension contributions and that now proposes to change the rules to deal with the fact that it has a large deficit as a result of the financial crisis. (Or rather, it says it has a large deficit, but there are arguments that the amount by which it is in deficit or surplus is highly volatile, so major changes are not necessarily justified.)

Of course, any change will have to be in the direction of making the deal less generous for those with pensions. Indeed, changes have already been made. Until a few years ago, the amount you got at the end was based on your final salary. More precisely, you got one 80th of your final salary per year after retirement for each year that you contributed to the scheme, up to a maximum of 40 years of contributions (and thus a maximum of half your final salary when you retire). But a few years ago they closed this final-salary scheme to new entrants, because (they said) it had become too expensive. This was partly because now a much larger proportion of academics end up as professors, so their final salaries are higher, and also of course because people live for longer.
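For concreteness (my own illustration, not from the original post), the old final-salary rule translates into a one-line calculation:

```python
def final_salary_pension(final_salary, years_contributed):
    # Old USS rule as described above: 1/80 of final salary per year of
    # contributions, capped at 40 years (so at most half of final salary).
    return final_salary * min(years_contributed, 40) / 80

# For example, 40 years of contributions on a final salary of 60,000
# gives an annual pension of 30,000.
```

The figures here are made up for illustration; the point is just the 1/80-per-year accrual and the 40-year cap.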

Sums of kth powers

November 4, 2014

Recently I was giving a talk at a school in London and had cause to think about sums of kth powers, because I wanted to show people the difference between unenlightening proofs and enlightening ones. (My brief was to try to give them some idea of what proofs are and why they are worth bothering about.) On the train down, I realized that there is a very simple general argument that can be used to work out formulae for sums of kth powers. After a brief look online, I haven’t found precisely this method, though I’ve found plenty of methods in the vicinity of it. It’s bound to be in the literature somewhere, so perhaps somebody can point me towards a reference. But even so, it’s worth a short post.

I’ll start with the case k=1. I want to phrase a familiar argument in a way that will make it easy to generalize in the way I want to generalize it. Let R_2(n) be the rectangle consisting of all integer points (x,y) such that 1\leq x\leq n and 1\leq y\leq n+1. We can partition R_2(n) into those points for which x\geq y and those points for which x<y. The number of points of the first kind is 1 + 2 + ... + n, since for each x we get x possibilities for y. The number of points of the second kind is 0 + 1 + ... + n, since for each y we get y-1 possibilities for x. Therefore, 2(1 + 2 + ... + n) = n(n+1) and we get the familiar formula.
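The counting argument above is easy to verify by computer for small n; here is a quick Python sketch (mine, not from the post) that builds the rectangle, splits it into the two classes of points, and checks that both counts equal n(n+1)/2.

```python
def check(n):
    # The rectangle R_2(n): integer points (x, y) with
    # 1 <= x <= n and 1 <= y <= n + 1.
    rect = [(x, y) for x in range(1, n + 1) for y in range(1, n + 2)]
    first = sum(1 for (x, y) in rect if x >= y)   # 1 + 2 + ... + n
    second = sum(1 for (x, y) in rect if x < y)   # 0 + 1 + ... + n
    assert first == second == n * (n + 1) // 2
    assert first + second == n * (n + 1)          # the whole rectangle

for n in range(1, 30):
    check(n)
```

Both halves of the partition have the same size, so twice the sum 1 + 2 + ... + n is the total number of points, n(n+1), exactly as in the argument.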