Max's vocabulary started exploding around 15-17 months old. I started writing down every word he could say (and understood) starting at 18 months, when he could say 107 words. I stopped when he hit 1000 words, a day before his second birthday. During that time, he learned an average of five words a day. Here's the graph:
I did kinda want him to hit 1000, so I spent extra effort teaching him words in the last nine days.
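That five-words-a-day figure checks out as a rough average. Here's the arithmetic, assuming about 183 days between the 107-word count at 18 months and the 1000-word count just before his second birthday (the exact day count is my assumption):

```python
# Rough sanity check of the learning rate. The 183-day span is an
# assumption: roughly the six months from 18 months old to the
# day before the second birthday.
words_at_18_months = 107
words_at_2_years = 1000
days = 183

rate = (words_at_2_years - words_at_18_months) / days
print(f"about {rate:.1f} words per day")  # about 4.9 words per day
```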
Out of the 1000, he knew 248 Chinese words, 738 English words, and 14 words in other random languages. He learns most of his Chinese words from me, and some at daycare. I speak mostly Chinese to him, but it's not natural for me, so sometimes I switch back to English, plus Chloe speaks all English to him, which is why his English is way ahead.
What kind of words does he know?
I always love it when people share revenue data for their apps / games / books / works, and it's been four years since The Motivation Hacker came out, so here's another updated graph of ebook sales by platform, now including years three and four. (See also Second Year Book Sales, First Year Book Sales, Aftermath: The Motivation Hacker)
I forgot to check the numbers for year three, and now it's been just over four years, so I thought I'd take another look. Looks like sales have more or less held steady on Kindle and the other ebook platforms, but CreateSpace paperbacks have come out of nowhere, so year four was better than year three or even year two. Cool! (CreateSpace paperbacks cost more than ebooks, $7.99 vs. $2.99, but I set it so that the royalty is the same either way, about $2.21.)
Ratings kept falling on Amazon, from 4.4 two years ago to now 4.1. Goodreads inched down from 3.85 to 3.81, with a combined total of 633 ratings. I wrote the book quickly (about 200 hours total) and have now made about $17K, or around $86/hr. Not a huge amount, but not bad for something that I initially thought only 50-100 people would read.
Now I have two kids. So far, it's pretty similar to just having one kid. The time and energy needed to sustain double the lifeforms can be obtained almost entirely from further lowering of parenting standards. There are a lot of good jokes about this. You can see from this blog how it goes. Max got a post at one month announcing his existence and another at four months about parenting surprises. Clark? He's eight months now, and this is all he gets.
Anyway, behold how cute he is:
Humans are naturally bad at predictions. Being bad at predictions, and figuring out how to make better ones, sounds kind of fun, but not necessarily like a basic life skill. Like, you can say, "I bet I'll spawn a baby infant by 2020 with 80% confidence!" or "Oculus Rift is going to be a market success - 65%", but you can't just want to make a really close friend and say, "I predict that I will make a really close friend this month with 95% probability" and magically have it happen.
Or can you?
I mean, the kinds of things you can try to do to make close friends are known. Get out of the apartment, go to social events, reach out to people you'd like to hang out with, be vulnerable, open up your schedule and routine to allow for repeated and unplanned interactions, practice your social graces, spend less time at work or with people you don't actually want to be close friends with, wear a really cool hat, and so on.
But it feels like there's a lot of uncertainty on 1) whether you will actually do these things and 2) what the results of each effort will be. You vaguely wonder, "What if they don't like me?" "What if I don't like them?" "Does anyone really have time to make new close friends?" "What parties would I even go to?" and so on, until it feels like it probably wouldn't work. So you don't attempt any of these things, and so of course it doesn't work.
I use a site called PredictionBook to make predictions, say how likely I think they are to occur, and then later judge whether I was right or wrong. Over many predictions, this improves my calibration on knowing what will happen in the future.
The reason this is super useful is that pretty much all humans are overconfident in their predictions. As one blogger put it:
Nearly everyone is very very very overconfident. We know this from experiments where people answer true/false trivia questions, then are asked to state how confident they are in their answer. If people’s confidence was well-calibrated, someone who said they were 99% confident (ie only 1% chance they’re wrong) would get the question wrong only 1% of the time. In fact, people who say they are 99% confident get the question wrong about 20% of the time.
So if I say I think my team will ship Feature X before March with 80% confidence, I want that to happen 80% of the time, not 55% of the time. If I think we will be 95% likely to hit our sales targets, then I'd like us to miss only 5% of the time. I want to be well-calibrated. This is really important not just for business strategy, but also for day-to-day decisions. If I'm "pretty sure" that I'll have enough time to hit the gym before my meeting, then I don't want to miss my meeting because my "pretty sure" is a human's normally overconfident way of judging a "slightly probable" situation.
But it's hard to realize that when you say something is 99% likely, it only happens 80% of the time, unless you keep track of your predictions. So, I practice using PredictionBook. Over time, I have become less confident at the high end of certainty and more confident at the low end:
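The bookkeeping behind a calibration chart like that is simple: log each prediction as a (stated confidence, did-it-happen) pair, then compare each confidence level's claimed probability against its actual hit rate. PredictionBook does this for you; here's a minimal sketch of the idea in Python, with an invented judged-prediction history for illustration:

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group judged predictions by stated confidence and report each
    group's actual hit rate alongside how many predictions it holds."""
    buckets = defaultdict(list)
    for stated_prob, came_true in predictions:
        buckets[stated_prob].append(came_true)
    return {
        stated_prob: (sum(outcomes) / len(outcomes), len(outcomes))
        for stated_prob, outcomes in sorted(buckets.items())
    }

# Invented history: (confidence I stated, whether it happened).
history = [
    (0.99, True), (0.99, True), (0.99, False), (0.99, True), (0.99, True),
    (0.80, True), (0.80, False), (0.80, True), (0.80, True), (0.80, False),
]

for stated, (actual, n) in calibration_report(history).items():
    print(f"said {stated:.0%} -> happened {actual:.0%} of {n} times")
# said 80% -> happened 60% of 5 times
# said 99% -> happened 80% of 5 times
```

With enough predictions per bucket, gaps like "said 99%, happened 80%" are exactly the overconfidence the trivia experiments found.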
In language learning, there's a slightly obscure but useful concept of "extensive reading" vs. "intensive reading". See this article for more discussion. Basically, if you are reading easy stuff quickly with very few unknown words, you are reading extensively. If you are reading difficult stuff slowly and having to either skip or look up many words, you are reading intensively.
Almost all learning materials tend towards intensive reading, whereas extensive reading is more effective and fun. It's very difficult to get large amounts of foreign language text that happens to use very few words the learner doesn't know, especially in the beginning when the learner doesn't know many words. Thus the textbook. Ain't no one likes the textbook. Eventually you can read a translation of Harry Potter, but most students don't get there. Especially in languages like Chinese, where reading is extremely difficult and slow and one can't rely much on cognates for unfamiliar words, learners really only get intensive reading, and their reading speeds stay very slow forever (if they don't give up entirely in frustration).
We can broaden this concept of "extensive reading" beyond reading into "extensive learning". For math, it would be very quickly doing a lot of problems, almost all of which are trivial, like one might do in a brain training game. (In school, math assignments tend to focus only on new techniques, which are difficult, with problems that take a while to learn and do, and not many review problems.) For history, it would be reading stories about events you already know well, so that the new details are few and easily hung on the tree of your existing history knowledge. (Traditionally, "studying history" is usually about trying to absorb a ton of facts about an entirely new historical scenario, and it's hard to remember afterwards.) So on, so forth–the pattern is that properly paced extensive learning is actually kind of fun, but schools typically have to rely on intensive learning, to the detriment of retention and engagement.
What does extensive learning look like for programming? Writing lots of code you know well many times, with only a few new things here and there, not slowing you down much. CodeCombat was designed to do this as much as possible. I think we have a ways to go before it really feels extensive without being boring for all students, but we are doing a lot better than other learning methods. One key is to keep the players writing familiar code, so the pacing isn't difficult, but to keep it from being boring using gameplay. The other is to never stop for long to make learners do something other than coding (like reading too many instructions or watching a long lesson).
If we think of learning programming like learning a foreign language, it's clear that you want to have a lot of extensive conversations with the computer. I think many programmers take a long time to become fluent, and a big reason for that is that they're almost always in intensive mode–writing some code that's difficult and halting, spending more time reading documentation and debugging than actually expressing themselves. CodeCombat is my attempt to save them. Our goal is that our players will be native speakers of code.
Here's another concept from language learning pedagogy (see also Extensive vs. Intensive Learning): the difference between active vs. passive learning, or production vs. recognition. You learn faster and retain knowledge much better when you practice it actively, rather than passively. Okay, sounds kind of obvious, right? But A) the effect is probably more extreme than you think, and B) just about everyone teaching, making educational software, or even trying to communicate ideas in a meeting gets it wrong all the time, so we clearly only passively know that active learning is better. ;)
I'm interested in talking about a focused definition of active and passive learning that goes beyond "doing math problems is better than listening to a lecture on math", about the exact details in which you practice a skill. In language learning, we call it production vs. recognition: actually saying or writing a word vs. just hearing or reading it. We learned the hard way at Skritter that seeing a new Chinese character, even if it's relatively simple and you see it all the time and you know how to write other characters, does not do you any good in terms of knowing how to write that character. You only have a recognition knowledge of it. A hilarious example came from a guy who was pretty good at Chinese: he could speak okay and read over 1600 characters, but had never learned to write. He checked out Skritter, tried to write the character 你 (nǐ, which means "you")–something he'd heard, said, and read tens of thousands of times–and he realized he had no idea how to write it!
By that point, we weren't surprised at all, because we had tried many variations of "let's show you the character first if you have never seen it before", but they never worked–the only thing that worked for people learning new characters was to try to write the character. So in essence, having a flashcard with the character on it was virtually useless. You could see the prompt ("nǐ: you"), flip over the flashcard, and rate whether you "knew" the 你 that you saw, but unless you mentally or physically drew it out, you would never actually learn the character, even in hundreds of tries. Basically, we built the entire Skritter business off the ramifications of the fact that it was very hard for people to make flashcards for Chinese that would let them do active learning.
One non-language-learning example I've heard is a good one: you would know a $5 bill if you saw it, right? Even if you didn't see the number 5 on it? Not too hard, given the number of times you've seen a $5 bill in your life. But now tell me, what are some of the things on a $5 bill? ... I just tried this and not only did I get the dude wrong, but I couldn't remember any of the other stuff on it. Try for yourself, and then check your result. Recognition and production are almost totally separate skills! And recognition (passive learning) is a horribly inefficient way to learn anything, 'cause you won't remember it.
So of course with CodeCombat, we try to avoid time spent on passive learning / recognition activities and instead have the player focus on producing / actively learning. They're spending almost all their time coding! That's a big part of why players learn a lot faster than with other resources that would rely on a video / lesson / explanatory text talking about how to use code before the player actually gets to try to use code. And don't get me wrong: that stuff has its place, too, just like intensive learning has its place–but it should definitely be a minority of time spent, a distant second to the active learning practice.