Would it be possible for you to put a copy of the article you mentioned, “Mathematics, Memory and Mental Arithmetic,” online, or could I email you for a copy (if the publisher allows)? This is definitely the sort of thing I have had in the back of my mind, but I would be very keen to see others’ thoughts on it.

I, for one, would love to read your article “Mathematics, Memory and Mental Arithmetic.”

Btw, Eric Mazur has an interesting video on understanding vs. memorization in teaching physics: https://www.youtube.com/watch?v=WwslBPj8GgI


Seconded, and you can also insert “Canadian”.

Also, at a reasonable number of UK universities, it isn’t until the 2nd year or later (of the “major”) that students learn what a field is.

Perhaps at a US university, before people have chosen their majors, people might be tempted to choose another option (such as B, because vector spaces are to do with algebra and not calculus), while not noting that the obvious scalars in D do not form a field.

FWIW, at most US universities, I think that most students who haven’t chosen a major yet won’t know what a “vector space” or a “field” is.

On your last point, Barton has a collection of questions he calls SSDD (same surface structure, different deep structure). They aren’t trick questions at all — just questions that can’t be done on autopilot because one has to do exactly what you advocate — that is, get behind the surface structure to see what the deep structure is.

For example, instead of giving lots of questions where one has to work out one of the sides of a right-angled triangle given the other two sides, one could have questions where you have to work out the area, the altitude when the hypotenuse is the base, the perimeter, and so on, so that students don’t go into “It’s a right-angled triangle, so I must use Pythagoras” mode.
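The variety described above can be made concrete with a quick sketch (my own illustration, not from Barton’s book) computing, from just the two legs of a right-angled triangle, the different quantities such a question set might ask about:

```python
import math

def right_triangle_quantities(a, b):
    """Given the two legs a, b of a right-angled triangle, return the
    quantities an SSDD-style question set might ask for."""
    c = math.hypot(a, b)      # hypotenuse, by Pythagoras
    area = a * b / 2          # half the product of the legs
    altitude = a * b / c      # altitude to the hypotenuse: 2 * area / c
    perimeter = a + b + c
    return {"hypotenuse": c, "area": area,
            "altitude": altitude, "perimeter": perimeter}

print(right_triangle_quantities(3, 4))
```

The point of the exercise set, of course, is that only one of these questions is answered by the Pythagorean step alone; the others require deciding what role that step plays.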

I tried to give some indication of this in the post. The idea is to predict (using one’s experience of teaching) the incorrect reasoning that people are likely to make, and to give the answers that that incorrect reasoning yields. To give a very simple example, if you asked them to subtract 28 from 80, then a good choice of wrong answer would be 62, which one obtains by noting that it ends with a 2 and that 8-2=6. (So actually, contrary to what Deane Yang writes, the point is very much to worry about the reasoning that the students use.)

I don’t think designing good diagnostic questions and answers is all that easy. But Barton (I think) has created a website devoted to such questions, so teachers don’t all have to do it for themselves. Unfortunately you have to register to look at the questions, but at least it is free.

First, we use such questions as diagnostics (in addition to straightforward arithmetic and algebraic computation problems) to see whether a student was properly trained in their earlier math classes. If they do poorly on problems that require properly interpreting the meaning of the text, then they are not ready for university-level math.

Second, I don’t see how using “pairs” is at all sufficient. Certainly, the first problems given to a student should be straightforward. However, after that, almost *every* problem should be designed so that it cannot be answered using a superficial reading. What is true is that every so often you should throw in a problem where the obvious answer is the correct one. The latter is mostly for building a student’s confidence that they really can understand correctly what is being asked. If a student successfully learns how to find the deep structure of a problem, then a problem where the superficial and deep structures agree and the answer is obvious can be quite unsettling to the student.

We do not want to teach a student only to distinguish superficial and deep structure. This feeds the view that math problems are “trick questions”. We want every student to immediately step around the superficial structure and look *only* for the deep structure. A question is not a trick question if it can be solved by a straightforward rigorous reading of the question. The goal is to make them as fluent in doing this as it is for us (which is to say that it will never become easy). Only then should we say that a student knows how to do math.

The point is to not worry about the reasoning the student used. The probability that a student can score well on a well-designed diagnostic test using mostly faulty reasoning is extremely low. Well designed means that you give questions where the obvious guess won’t work.

Designing a good diagnostic test for a particular course is not easy. It requires someone who has not just a lot of experience teaching the course but has already devoted a lot of attention to diagnosing why students taking that course do poorly. Without firsthand experience, it is *impossible* to know what to look for. I say this because I went through this myself. I taught university level math in the same naive way that Barton did for many years. Then my department hired an instructor (Jerry Epstein, now passed away) who knew the math education research about why so many students and adults fall short in math and who had devoted a lot of effort to designing and validating diagnostic tests. The questions were *below* high school level, but even at top US colleges, as many as 10% of the students (but not necessarily math majors) did poorly. He explained to us how this all happened. We of course started administering the test to our students, which confirmed everything he said. After this, we redesigned our diagnostic test (which included a careful validation process) and tried our best to change how we taught our precalculus and calculus courses.

I also recommend looking at the first chapter of Calculus by Hughes-Hallett, which has excellent precalculus problems that I feel are good questions testing a student’s readiness to take calculus. Here is one of my favorites:

Find the exact value of arccos(cos 4).

If you view this as a trick question, you are missing the point. Students should be learning math as a meticulous, logically rigorous process, and should solve this problem in that fashion. If they answer “4”, they have not learned how to do math correctly.
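For what it’s worth, the rigorous reading goes: arccos takes values in [0, π], and since cos 4 = cos(2π − 4) with 2π − 4 ≈ 2.28 lying in [0, π], the exact value is 2π − 4, not 4. A quick numerical sanity check (my own sketch):

```python
import math

# arccos maps into [0, pi], so arccos(cos x) equals x only when x is in [0, pi].
# For x = 4 (radians), cos 4 = cos(2*pi - 4), and 2*pi - 4 lies in [0, pi],
# so the exact value is 2*pi - 4.
x = math.acos(math.cos(4))
assert abs(x - (2 * math.pi - 4)) < 1e-12
print(x)  # approximately 2.2831853
```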

A bit of a tangent, but how do you go about picking the “wrong” choices for such a diagnostic question?

About a month ago, I wrote an email to my son’s physics teacher, claiming that his school should offer a course, or at least a few lectures, on “how to think” — the example I gave was “how to think about things that don’t quite fit in your brain”, and the specific example I gave was a geometry proof where things got much easier if you just started writing down all the facts that were obvious.

Not claiming my thoughts were original (clearly not) but I just bought Barton’s book to see what else is in there, and would love to see your article if you can somehow make it available.

Thanks for a fascinating weblog.

I had come to more or less the same conclusion regarding short-term memory based on my undergraduate education. While tutoring my friends in the Calculus and Linear Algebra courses, I found that the most common problem they faced was holding many ideas in their mind at the same time. I believe that the large number of problems I had to solve throughout my schooling helped develop my fluency in basic manipulations. I like to think that it frees the memory by pushing those skills to an instinctive level.

Regarding diagnostic questions, I have felt that some of the entrance exams in India for undergraduates and graduates do design their papers in such a manner. The purpose may only be to weed out the less competent students rather than to improve their education, but I’m sure it is possible to design such test papers for use in university classrooms. I cannot say how much work it would involve on the part of the instructors, though…

Luckily, Barton’s book is available in my country at a reasonable price, and I will be buying it at the earliest.

Eric Mazur had a similar experience when teaching Physics at Harvard; he gave a lecture about it, which is available here: https://youtu.be/WwslBPj8GgI

Another couple of talks which might be of interest: Sandra Laursen on a research project on the impact of IBL (Inquiry Based Learning): https://youtu.be/nEBkk1QfA0k and Michael Starbird asking what we want our students to get out of our math(s) courses: https://youtu.be/VVSaNNrkeEM

Best wishes

There is indeed a parallel, and it’s one of the reasons I found that particular passage so interesting. There is a distinction between forwards reasoning, where you explore the consequences of what you know, and backwards reasoning, where you use what you know to reduce what you are trying to prove to something easier. For what Mohan and I were doing, backwards reasoning tended to be the mode of choice, because in practice that tends to lead to less search. But that isn’t always the case: sometimes there are lots of potential reductions of what you are trying to prove, without any of them standing out as being particularly good, while in the other direction there is the possibility of making an “observation” that, once made, is clearly helpful.
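As a toy illustration of the two modes (my own sketch, in no way the actual program; the rule and fact names are made up), backwards reasoning reduces a goal to subgoals until known facts are reached, while forwards reasoning keeps deriving consequences until nothing new appears:

```python
# Toy knowledge base: each rule says "conclusion follows from these premises".
RULES = {
    "isosceles": [["two_equal_angles"]],
    "two_equal_angles": [["angle_A_eq_angle_B"]],
}
FACTS = {"angle_A_eq_angle_B"}

def backward(goal, facts, rules):
    """Backwards reasoning: reduce the goal to subgoals until facts are reached."""
    if goal in facts:
        return True
    return any(all(backward(p, facts, rules) for p in premises)
               for premises in rules.get(goal, []))

def forward(facts, rules):
    """Forwards reasoning: derive consequences until the fact set stops growing."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conclusion, bodies in rules.items():
            if conclusion not in facts and any(set(b) <= facts for b in bodies):
                facts.add(conclusion)
                changed = True
    return facts

print(backward("isosceles", FACTS, RULES))   # True
print("isosceles" in forward(FACTS, RULES))  # True
```

In this toy setting the two modes reach the same conclusion; the trade-off in the comment above is about how much *search* each mode generates on realistic problems.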

I’m amused by what you say about those books. I read one myself, which I found in my school library, about how a radio works. I remember showing it to my father (this would have been in about 1978, so indeed 40 years ago), whose reaction was “That’s a good account of how radio worked about thirty years ago.” Even if it was out of date, it conveyed to me the basic idea of amplitude modulation, which has stuck with me ever since.

On how to get students to think about a problem – for example, “fill in as many angles as you can” rather than “show that the triangle is isosceles” – is there any parallel with the ways in which good theorem-proving programs work? The program you prepared with Mohan Ganesalingam a few years ago was, I think, designed to think like a human mathematician. Obviously that was working at a much more sophisticated level, but did it use comparably indirect approaches along the lines of “Never mind the problem set, let’s work out everything we can about this diagram/series/topological space and then see where that takes us”?

On using wrong answers to multiple-choice questions to diagnose errors and provide appropriate help, something very similar was in vogue about 40 years ago under the name of programmed learning. There were books like that: “What is the answer? If A, go to page 32 [which would take you to the next problem, if A was the right answer]. If B, go to page 137 [which would explain the mistake you had made, set you another problem on the same point, then if you got that one right send you to page 32 to rejoin the main stream]. If C, go to page 48 [where your mistake would be explained, etc.]” There were also machines with the pages on rolls of film and keyboards to select your answers (the film would scroll back and forth in response). Now I think we would achieve the same effect with less clanking and whirring.
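That branching scheme is easy to mimic in a few lines (a minimal sketch reusing the page numbers quoted above; the page contents are made up):

```python
# Each answer choice maps to the page the reader should turn to,
# mirroring the "If A, go to page 32..." scheme described above.
BRANCHES = {"A": 32, "B": 137, "C": 48}
PAGES = {
    32: "next problem (A was correct)",
    137: "explanation of the mistake behind B, plus a retry",
    48: "explanation of the mistake behind C, plus a retry",
}

def turn_to(choice):
    """Return the page number and its content for a given answer choice."""
    page = BRANCHES[choice]
    return page, PAGES[page]

print(turn_to("B"))  # (137, 'explanation of the mistake behind B, plus a retry')
```

The modern equivalent is the same lookup table with the clanking and whirring replaced by a click.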
