A Calculus goal this year was to be more upfront about where some of our patterns come from. I really disliked Named Theorems. For example, I have found no purpose in having kids learn the name "Fundamental Theorem of Calculus." It becomes a parrot moment. You say "what's the Fundamental Theorem of Calculus?" and I'm sure they can rattle off the pattern. But is there any connection to what that actually does? Does knowing the name improve their ability to calculate a definite integral? I think it's debatable. Talking about final minus initial position has served me better.
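To be concrete about that framing: with a generic position function s(t) and its velocity s'(t), the fact underneath it is just

\[
\int_a^b s'(t)\,dt = s(b) - s(a),
\]

accumulated rate of change from a to b equals final position minus initial position. That's the sentence I'd rather they own than the name.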

Shoutout to my time at the Desmos Fellowship for the following revelations. Sam and Sarah blew my mind with the area model for the derivative of a product of two functions. The proof is so beautiful and perfectly suited for high school kids. WAY better than the awful derivation from the limit definition that flew over my head. I used it during Hurricane School to great effect. That prompted the question: well, what about the derivative of a quotient? Does that have a proof? I didn't have an immediate answer, but YouTube to the rescue. The relationship set up at the beginning of that proof was brilliant.

Here are both proofs written out:
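(Rough sketches of the standard versions of both arguments, with f and g as generic functions of x.)

For the product, the area model: take a rectangle with side lengths f and g, nudge x, and the new area picks up three pieces,

\[
\Delta(fg) = f\,\Delta g + g\,\Delta f + \Delta f\,\Delta g
\quad\Longrightarrow\quad
(fg)' = f\,g' + g\,f',
\]

because the tiny corner piece \Delta f\,\Delta g vanishes in the limit.

For the quotient, rewrite it as a product first and lean on the rule we just proved:

\[
Q = \frac{f}{g}
\;\Longrightarrow\;
f = Q\,g
\;\Longrightarrow\;
f' = Q'\,g + Q\,g'
\;\Longrightarrow\;
Q' = \frac{f' - \frac{f}{g}\,g'}{g} = \frac{f'\,g - f\,g'}{g^2}.
\]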

Just so simple. And easily discussed in a few minutes. AB got to the product and quotient cases some time later, and I made a point to show them where the patterns come from. I wasn't super interested in them regurgitating the proofs; I just wanted them to follow the logic. We were able to have some great discussions about the commutative nature of the two setups. The terms in a product derivative can go in whatever order we choose; a quotient requires us to be more careful. Later we saw that the product rule can scale to more than two functions.
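For the scaling point, the three-function case shows how the pattern keeps going:

\[
(fgh)' = f'\,g\,h + f\,g'\,h + f\,g\,h',
\]

one term per factor, with each factor taking its turn being differentiated.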

I decided to cover all our derivative use cases now, rather than coming back to them later. Next up were exponentials and logs. Twitter to the rescue:
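The gist, at least as I'd sketch it (one standard route, boiled down): the limit definition pushes everything onto a single constant, and e is the base that makes that constant equal 1.

\[
\frac{d}{dx}\,a^x
= \lim_{h\to 0}\frac{a^{x+h}-a^x}{h}
= a^x\cdot\lim_{h\to 0}\frac{a^h-1}{h}
\]

With a = e that leftover limit is 1, so e^x is its own derivative; for any other base it works out to \ln a, which is where \frac{d}{dx}\,a^x = a^x\ln a comes from.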

Later Dave Cesa dropped this bomb:

Once you establish that e^x is its own derivative, you can show the logic for the derivative of the natural log:
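Written out, the logic goes something like this:

\[
y = \ln x
\;\Longrightarrow\;
e^y = x
\;\Longrightarrow\;
e^y\,\frac{dy}{dx} = 1
\;\Longrightarrow\;
\frac{dy}{dx} = \frac{1}{e^y} = \frac{1}{x}.
\]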

Again, so great and simple. Props to me for having the foresight to cover implicit derivatives (and isolating dy/dx terms) prior to this, so the dy/dx logic wasn't crazy bananas to them.

In the end, yes, most kids will just learn the patterns as a matter of faith. They will forget these proofs and will probably never reproduce them unless they grow up to be math teachers. But it was important to me to show them that math isn't magic, that we can reason our way to new ideas. Don't take the product rule on faith because I said so.

I can tell you that, as a student, seeing something like this would've been so helpful. Despite my having a great teacher, Calculus felt like a series of rules I had to learn for the sake of learning them. It was many years before I realized how it all fit together.

Posted by Jonathan Claydon