Computing as a Subject

I've been thinking a bit lately about the identity of Computing as a subject. This has been brought on by a few factors, some recent, like a professional development day held by the local vocational training provider (South West Institute of Technology), where a local business council member presented the results of a survey of small businesses and the skills they were after from young people coming out of education and training. Of course I mostly paid attention to the technology skills, which were the same old song we've heard since I started teaching a decade ago: the Office suite, and not much else.

Since the pushback against the whole Digital Native vs Digital Immigrant rubbish a while back, I keep hearing from teachers about students who can't use "technology" (again, mostly the Office suite) in their subjects for research and presentation.

From a teaching perspective, there was little set in stone about what students actually needed to do in high school before the subjects tied to graduation kicked in, so schools largely did what they liked. Some focused on the basic computer skills students would need for other subjects and officey-type jobs; some focused on media production, engineering, programming and so on. It was largely up to individual teachers and their skill sets. Personally, coming from a Computer Science background, I always enjoyed teaching students about how computers worked, programming and networking, but discovered an enjoyment of animation and digital publishing along the way. I begrudgingly taught Word and PowerPoint, less so Excel, where I could look at how to make it more than a glorified table.

Enter the new Australian Curriculum. Amidst a global push to "teach kids computers" so that everyone can be the entrepreneurs of the future, suddenly people are talking about computational thinking and how kids should all learn to code. Packed into the achievement standards are networking, security, algorithms, software development, project management, user interface design and more. Very little of it could be mistaken for learning to type, using headings in Word or formatting letters. A smattering could be taught using PowerPoint, and more of it with spreadsheets like Excel.

I should point out that this isn't the extent of technology in the curriculum. This is merely the specialist area of "Digital Technologies". There is also a list of general ICT capabilities which are intended to be embedded across the rest of the curriculum. This is where the traditional "computing skills" lie: how to present information, how to research reliably, how to record information and so on.

This leaves us as educators in an interesting situation: where do students learn these skills and develop their understanding of the basics? The other subject areas will often be quick to tell you that their own parts of the curriculum have become more crowded with specialist content, leaving them time-poor, and there are a significant number who will tell you that the responsibility for teaching the "how" of almost anything with buttons or a screen rests with IT. We have seen that the hordes of supposedly IT-savvy teenagers don't necessarily come equipped with the right skills out of the womb. So who does the teaching? Even if it becomes mandated that these general capabilities are taught in the areas of study in which they are needed, there are (and will be for quite some time) many teachers who don't feel confident enough in their own IT skills to teach the tools which are becoming more useful in their subjects (this is of course a sweeping generalization, and there are plenty of teachers who do great things with technology whilst coming from a non-tech background). I don't have a good solution to this situation, but then, is it actually something we should be caring about?

Let's go back to the small business survey. As a teacher who has taken vocational subjects over the years, I've used surveys like this to try to motivate students with poor IT skills who are looking at entering the workforce before graduation, citing the number of non-tech businesses which use technology quite heavily and want new employees who can handle that aspect of the job. We are entering a situation where, up until the end of year 10, IT as a subject will be about creative and development skills, and a deep understanding of how technology can be used and how it affects our lives and workplaces. When students go into years 11 and 12 they have a choice between university-bound subjects, where these skills continue to be developed, and jumping tracks to vocational skills which bear no resemblance to computing as a subject up until that point.

My question from this is: does this need to change? Does business actually know what will be useful for it 5+ years down the track, given how notoriously difficult it is to predict the future of business and technology? (Which always makes me laugh when someone is introduced as a "futurist".) Businesses that have done well in recent times have been those that embraced technology to make themselves more efficient and/or to exploit different markets, leaving others to scramble to catch up, or to atrophy. Just look at the rise of the likes of Airbnb and Uber, and going further back, eBay and Amazon.

I am wondering when we will get to the point where small business starts to compare someone with the traditional Office-y sorts of skills, who can do an office-y sort of role quickly and effectively, against someone without those skills (but still with an understanding of the business) who has a deeper understanding of technology and analytical skills: someone who can examine a situation and use that understanding to solve that whole class of problem, be it through automation, more effective application of technology and so on, and then use their time in some more productive way. In other words, someone who can automate themselves out of a job, but apply their skills to a more productive job in return.

This hypothetical situation is of course pretty outlandish right now. Those with the sorts of skills described mostly fall into the academic pathway for the time being, but what will happen when those who have grown up with the new Digital Technologies curriculum start leaving school? The ideal outcome isn't that everyone becomes a programmer or a software engineer or a database administrator or a network engineer. The ideal outcome is that no matter what industry or field of interest school leavers end up in, they have the understanding to apply technology to it to make it more efficient or to explore options which were previously unavailable to them.