Where is the Next Generation of Developers?
By Craig Buckler
Without wanting to sound like an old man, I am concerned for the next generation of developers. Fewer pupils are taking IT courses and computing is often replaced by “softer” subjects.
I consider myself fortunate to have been around at the dawn of home computing during the early 8-bit days of the 1980s. The industry was revolutionary, exciting, and – even better – our parents were totally clueless about the technology. The hardware may have been primitive and the software could be awful, but computers were accessible. You could buy a Sinclair ZX Spectrum and be writing programs within an hour (the original Speccy came with an excellent BASIC manual).
Unfortunately, there is evidence that pupils are shunning IT even though most western schools have fantastic computer facilities. The number of UK computing students fell by 20% between 2004 and 2007 – the largest drop in any subject. A survey of over 2,000 secondary school pupils concluded that:
- computing as a subject had lost its novelty factor
- pupils thought computer science suffered from a geeky image
- many thought computing was too hard or did not understand what the subject involved
- computing was considered “boring”
- studying IT was only necessary if you wanted a job in the industry
- computing was regularly used in other subjects, so there was little need to study it.
Several other factors are also evident:
1. The rise of games consoles
Gaming is one of the most important reasons for buying a computer. Since the early 1990s, dedicated consoles have offered an inexpensive route to arcade-quality games in the home. Although children can access PCs too, the majority prefer to play on their games consoles. Unfortunately, gaming is the only thing a console can do; programming and experimentation are not possible.
2. Too many distractions
PCs are great; you can sit at them for hours and do precisely nothing. Surfing the net, chatting with friends, and twiddling with photos can distract you from doing anything else. Why should a child want to learn programming when every application they could want is already provided?
3. Starting development is too tough
Many programmers will berate me for this, but interpreted BASIC was a great first language to learn. It is easy to start, provides instant feedback, and teaches you the essentials. Most 8-bit computers booted into a BASIC programming environment so you were encouraged to try out commands and small programs from day one.
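To illustrate that immediacy: the famous two-line Spectrum BASIC listing (10 PRINT "HELLO" / 20 GOTO 10) gave a child a working program seconds after switch-on. As a rough modern analogue (my choice of language, not the article's), the same idea in an interpreted language like Python is still only a few lines, bounded here so it terminates:

```python
# In the spirit of the classic BASIC two-liner
# (10 PRINT "HELLO" / 20 GOTO 10), but stopping after three repeats.
lines = ["HELLO" for _ in range(3)]
for line in lines:
    print(line)
```

Typed straight into an interactive interpreter, each statement runs immediately – the same instant feedback loop that made BASIC such an effective teacher.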
What can students use now? Visual Studio and similar IDEs are far too complex for beginners. Forms-based Windows UI development and web-based client-server programming have a steep learning curve for all but the most dedicated novice developer. Although child-friendly languages are available, many are too simplistic and the majority are niche environments that few people know.
Are you a computer science student? What language did you learn first? Is the subject useful and well-taught? Will the IT industry suffer as fewer pupils choose computer science? Or is IT so ubiquitous that people naturally have the required skills?