Thursday 13 October 2011

The Duplo®code Fallacy

I've just been watching this BBC news article, which features luminaries from the UK industry talking about the woeful programming skills in the new generation of kids.

Let's make a case here: not only does the education system have computing wrong because we focus on ICT, but the programming industry has the wrong emphasis on computer science, because it believes in teaching kids using powerful environments on powerful systems, when what we need is dirt-simple systems and environments.

The case is really quite simple. Firstly, let's consider the programmers in the article: Ian Livingstone (2:14), David Braben (4:00), Alex Evans (6:36). They all learned to program on simple computers. Look at the clip from Making the Most of the Micro (3:25): it shows a BBC Micro, a computer ready to program as soon as you switch it on, and a listing from a printer containing a hundred lines of code.

Secondly, we learned to program without the aid of a Computer Studies class: instead, our parents bought computers for us and there was basically no access in schools. By the time we got to the class we already knew more about programming than an entire 'O' level would teach us. The classes just helped us do more of what we enjoyed.

Thirdly, let's consider the opportunities kids have to program these days. You'd think they would be 'better' than in the 80s, because for all the thousands of hours kids and adults spend on a desktop or laptop computer, they're only 2 clicks away from getting into code, only 2 clicks away from JavaScript. Or you could drop into a terminal window and hack out a simple Java/C/shell-script program within seconds.
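
To be concrete, here's a minimal sketch (mine, not one from the article) of what '2 clicks away' means: open a browser's JavaScript console and type a one-liner.

  // print a message 1000 times in the browser's JavaScript console
  for (var i = 0; i < 1000; i++) { console.log("Julz is FAB!"); }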

So why don't they? If more powerful computers and sophisticated environments are what you need, and this kind of thing is available now, why are there far fewer kids learning to program? Think about it: programming is 2 CLICKS away today, but our generation had to wait until we'd gone home and finished our homework, then set up our puny, slow, memory-starved computers with grotty low-res screens, crude languages and unreliable tapes, before we could even start doing anything.

How can it be that computers are >10,000 times more powerful and yet an order of magnitude less appealing to program?

The clue has to be in the question: it's the power itself that acts as an obstacle. Look at the 3 programming clips. In the BBC Micro clip there's no obstacle, because the display is directly programmed and you have a nice listing, so you can see everything - a few hundred lines of code. It looks fearsome, but it's nothing compared with the David Braben example (4:09). Here the screen shows a relatively sophisticated environment: there are at least 5 different screen panels, multiple tabs to access different options and a massive screen, and you can 'edit' roughly 15 lines of Duplo®code. This is on his 'simple' Raspberry Pi computer, which contains 300 million lines of code.

It's that kind of gob-smacking contrast that should make us wake up: all that power and sophistication driving 15 lines of Duplo®code. The computing culture today uncritically equates power and complexity with being better, which means we try to solve problems like the lack of programming expertise by throwing powerful tools at it. It's the powerful tools that are the problem when it comes to learning this stuff, not the solution.

This is why:
  1. People are put off programming because the tools we use are geared for other tasks. For example, I'm writing a blog instead of coding. It's easy to blog and there are lots of webby distractions, so the benefit/effort ratio of coding is much lower.
  2. Kids get put off programming because the environment is complex. I don't want to boot up an IDE and learn its arcane windows, menus, language, libraries, syntax and frameworks when I start learning. Instead I just want to get a kick out of doing something like making the computer display my name 1000 times with a one-liner like : hi 1000 0 do ." Julz is FAB!" loop ;
  3. Kids get put off programming because the many layers of software add too much guff to the effort. If I view source on a web page to see the JavaScript, I find it's wrapped in HTML (because JavaScript is built on browser technology), and outputting to a screen or getting data from a keyboard or mouse or whatever is just so much more effort than on the BBC Micro where David Braben learned his skills (see the sketch after this list). The guff also affects the third coding example in the video at 8:37, where they're editing SQLite database stuff - and that's exciting? Well, of course, it's more exciting than learning Excel!
  4. Kids will get put off programming because creating Duplo®code environments on top of sophisticated systems is patronising and deceptive. Any 7-year-old will realise that the real computer is nothing like what they're learning, and that the linguistic padding (e.g. the use of the 'green' colour in the language) is magical: that is, what's really going on hides a wad of complexity you don't have access to.
  5. Finally, kids will get put off programming if there's a hierarchy of access. Even if I ever come to believe Duplo®code is a real language, my programs will never really run on a real PC/Mac/Linux/Nintendo/XBox/iPhone and be distributed on an equal footing with everyone else's code. It's a world away from when we learned to program, when our code ran on computers people really owned and we could have it published in magazines or distribute it ourselves on tapes. So the kids know... in their hearts... Duplo®code is a waste of time.
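
To put some flesh on point 3, here's a rough sketch (mine, not the article's) of what the same 'print my name 1000 times' exercise from point 2 looks like once it has to go through a web page: one line of logic wrapped in layers of markup before anything reaches the screen.

  <!DOCTYPE html>
  <html>
    <head><title>Hello</title></head>
    <body>
      <script>
        // the one line of actual logic, buried in the wrapping
        for (var i = 0; i < 1000; i++) { document.write("Julz is FAB!<br>"); }
      </script>
    </body>
  </html>

The loop itself is no harder than the one-liner in point 2; it's the file, the markup and the browser plumbing around it that make up the guff.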

It's great that there are initiatives to encourage kids to program today. But our plan is like teaching kids to read using War and Peace while only letting them read the simple words, and teaching them to write by letting them arrange paragraphs and chapters. Nobody in their right mind would teach kids to read or write this way, but that's exactly what's being proposed here.

The real lesson to learn is that we need simple but real systems. Simple systems have no distractions; they're easy to access; they have a clean and simple syntax; they're not patronising and deceptive; and they're egalitarian. Simple systems were the way we learned - that's why it worked.

4 comments:

Martin Callaghan said...

Indeed. Perhaps the main issue is that many young people see their computer as a domestic appliance. They would no more need (or want) to program their computer than they would their washing machine.

Snial said...

Sure. Reconceptualising the computer as an appliance was a conscious shift in the 80s.

http://www.folklore.org/StoryView.py?story=The_Father_of_The_Macintosh.txt

And important too, because historically they were frighteningly hard for ordinary people to use. And now we're starting to see that the metaphor doesn't solve all the problems: we need people to see beyond the appliance so they'll be able to understand it, and so define its future.

-cheers from julz

Anonymous said...

Came across this after reading your original post about Xubuntu on G3. I agree with the post. I learned to program on a ZX-81 (1K) and BBC Micro. I totally agree that the 'O' Level in Computer Studies simply gave us time to do more of what we enjoyed... Programming. Spot on. To show my son programming I loaded a ZX emulator onto my PC. Sad?

James said...

The complexity is a big hurdle. Anyone can install development software on their PC and teach themselves how to code. But that's like saying anyone can build their own car by sticking some metal together and learning to weld.

While we might not like the Duplo-style "programming" that is currently the fad, it is at least making kids more aware of the concept that they too can make their computer do things and that it's not an appliance that comes sealed like their XBox.

We need to work hard to eradicate the dumbed-down thinking that has slowly infected the general population's brains. You know the kind - "Oh yeah lol I clicked a button and now I have viruses... oh well whatever" or "I couldn't find the Internet after you put that Firefox thing on so I had to make a new Hotmail and now all my emails are gone". That low level total non-understanding of what a computer even is.

They totally do think it's a microwave and have just as much interest in its workings.

Until we sort that mess out we'll never have a new generation of skilled programmers. I'm a secondary school teacher, and I'm also a programmer and general computer nerd (after writing this I'm off to mess about configuring a mailserver for fun)... this seems to be a rare combination.

Mind you, this is the first time in six years I've actually been able to teach programming. Until now it's been good old ICT.