April 19, 2009
How Are Kids Learning To Program These Days?

Here's an article on a topic I've bored people who know me with on many occasions over the last few months: how do kids learn to program these days?
This really, really bugs me. We have an unsustainable situation emerging in software development. Computing devices - devices that contain software of some kind - are becoming ubiquitous and the fabric of modern society is increasingly reliant on software.
At the same time, people in general are becoming increasingly removed from the inner workings of those devices upon which they're increasingly relying.
Extrapolate forward a generation; who is going to be writing the code that runs our civilisation?
When I was a youngster growing up in the late seventies and early eighties, computers were just finding their way into the home and GUIs were not commonplace. To interact with our Commodores and Sinclair Spectrums we had to type in interpreted commands. We had to program our computers to operate them.
And, although schools couldn't afford the number and sophistication of computers that they use today, we actually programmed a bit at school, too. We did a bit in Physics class. We did a bit in Maths class. I even did a bit in my A-Level art class when they bought an Amiga. In those days, when we still had high hopes for our technological future and kids actually wondered how things worked, smart teachers saw programming as a useful skill that pupils should try to learn if they got the chance.
I think that computer programming is a practical skill. It's a language skill. Learning to speak "computer" is a process not at all dissimilar to learning to speak French or Italian. You copy. You watch the reaction or effect of what you just said. Then you feed back that reaction and learn to speak it better. Bit by bit the language centres of your brain get a grip on the language, and eventually it becomes an unconscious skill, allowing you to focus on what you're saying while details like syntax and grammar are handled in the background.
The best time in life to start learning a new language is when you're young. British kids start learning French as young as five. Most of us start when we're eleven. It takes years of trial and error with a language - using it and seeing what effect/response it gets - to become fluent. Over hundreds, even thousands, of hours of practice, most people can become pretty fluent in just about any spoken and written language.
Computer programming also takes hundreds of hours of hands-on practice, and is by no means so difficult that an eleven-year-old couldn't make a useful start.
So why (oh why oh why oh why) are kids not exposed to computer programming at school? Why, given just how critical software is to the functioning of modern society, are we waiting until they get to University before they're exposed to real programming?
People tasked with teaching computer science to undergraduates tell me that the high drop-out rate on some of these courses is down to too many people falling foul of the (still amazingly scant) programming modules. Indeed, many CS lecturers don't actually program, either. So there's a lack of computer-speakers to lead by example. Imagine starting a degree in Linguistics having never spoken a foreign language. Or a degree in Music Theory having never played a musical instrument.
I really think we're headed for a major fall here. Not just our industry, but the computer-dependent society we're building. It's just not sustainable.
Kids should be programming at school. Not just for a semester or as an after-school activity. They should be programming for a few hours a week, every week, for 2-3 years. Your average 14-year-old should be fairly competent with a language like C# or Java or Ruby, capable of writing programs that solve realistic problems on the order of a few hundred lines of code. There should be a GCSE in Computer Programming, and an A-Level in Software Development. Students should be able to take "double computing" (CS + Programming), and CS undergraduates should be arriving at University already able to write fairly sophisticated programs, write unit tests, work in teams, do continuous integration, and have all the other foundational skills that a developer will need.
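To give a concrete sense of the level I mean, here's a sketch of the sort of small, self-contained exercise a school pupil might tackle early on - a hypothetical one, much smaller than the few-hundred-line problems above, but the same flavour: turn raw exam marks into grades and a simple summary.

```java
import java.util.List;

public class GradeBook {
    // Map a percentage mark to a grade band (hypothetical boundaries).
    static String grade(int mark) {
        if (mark >= 70) return "A";
        if (mark >= 60) return "B";
        if (mark >= 50) return "C";
        return "U";
    }

    // Average of a list of marks, rounded down by integer division.
    static int average(List<Integer> marks) {
        int total = 0;
        for (int m : marks) total += m;
        return total / marks.size();
    }

    public static void main(String[] args) {
        List<Integer> marks = List.of(72, 58, 64);
        System.out.println("Average: " + average(marks));        // 64
        System.out.println("Grade:   " + grade(average(marks))); // B
    }
}
```

Nothing here is beyond a 14-year-old after a term or two of regular practice, and it's exactly the kind of exercise that invites unit tests: does grade(69) really come back as "B"?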
It's NOT rocket science. It's not beyond the wit of a school kid. We proved that in the eighties. How many computer games were written by school kids? Calculus is conceptually much harder than programming, and we teach that at A-Level. (Well, we used to, at least. These days we seem to hand out qualifications to people who can tie their own shoelaces...)
If our educators and examining bodies won't fund and run these courses, then we should. Or else we risk waking up one day soon in a world where everything runs on software that nobody understands. Have you seen the film Idiocracy? Yes, it'll be a lot like that. You have been warned.
Posted on April 19, 2009