Now, I’ve already talked a little bit about how we can teach better. But today I’ll focus on how to fix the course/assessment gap, directly answering the problem raised in the last post.
So, the way I look at it, there are two solutions. One of them is bad, and I’m sure it’s part of the problem facilitators are trying to avoid. The other is good, but I’d say facilitators feel unable to do it (for many reasons, but I’d guess mainly that they don’t know how). I’ll even throw in a third one because I’m in a good mood.
The first solution, which is what’s wrong with high school education, is standardised testing. That is, test exactly what is taught. Sounds nice, doesn’t it? But what that boils down to is theory exams, or specific questions testing specific skills, like “the banner is this, send the exploit” or “do the buffer overflow” (although to be fair, that is one of the OSCP questions) or “SQL inject this page.”
This is less than ideal. Sure, you’ll have the skills down, but you’ll only have hammers in your tool belt. You do need exercises to build those skills, but they don’t make for a good assessment, because that’s not what security is like in the real world, and ideally, we want to prepare students for the real world.
I’d say facilitators are, for good reason, avoiding this type of assessment. The problem is that instead of “SQL inject this” they’re going “hack this (but don’t worry, you can’t because it’s webdav and we made sure you haven’t heard of it),” and calling it “problem solving” or “research” or whatever. I think we can do better.
There are two options: change the course, or change the assessment. Maybe a mix of both is best.
Let’s look to maths for how to change the assessment. In a maths course, you learn everything you need to learn during the semester. There are mid-semester tests which test very specific skills, and the final exam is very challenging, but with no tricks and no gotchas. If you learnt everything that was taught, and practised all of the skills, you should be able to complete the exam. The challenge comes from applying multiple skills to solve a given problem.
During a maths exam, there’s no lack of direction or lost feeling, and if you’re thinking “I don’t know how to do this,” it’s because you didn’t study that topic hard enough, not because there’s a trick to it that you don’t know. You have all of the tools you need; you just need to figure out which specific tools, and in what order, will solve the problem. And you’ve actually learned not just how to use each tool, but when to use it and how to apply it to various problems.
I think an assessment’s challenge should come from application (assuming you learned everything you were meant to), not from divining some secret knowledge. Sure, part of the challenge of security is enumeration, finding the vulnerability, and dealing with things you haven’t come across before, and those are skills we need to assess. But let’s keep to things that we’re actually teaching, and (leading me to my next point) actually teach those other skills that we’re assessing.
To change the course, we need to actually teach the process: how to approach a problem, how to know when to use the tools we have, and how to figure out something we don’t know. This comes back to being pragmatic, systematic and, most importantly in this context, holistic. We need to teach the big picture, to know why we’re doing what we’re doing, what to look for, and how to apply the skills we’re learning.
I know that it’s hard to teach, but by not teaching it we’re getting nowhere, so we need to at least make an effort to change. How do I approach a machine I know nothing about? How do I approach a webpage? What am I looking for when I do my enumeration? By considering, and teaching, the answers to these questions, we can produce actual, useful security professionals who know what they’re doing and can work in the real world.
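To make the point concrete, here’s a minimal sketch of what “teaching the process” could look like: writing down the usually implicit “what do I do next?” decisions so students can see them. The service names and follow-up actions here are my own illustrative assumptions, not a real methodology, and the fallback case is the “figure out something you don’t know” step from above.

```python
# A hypothetical teaching aid: make the implicit enumeration workflow explicit.
# The services and follow-up actions below are illustrative assumptions only.

NEXT_STEPS = {
    "http": [
        "browse the site manually",
        "check robots.txt",
        "enumerate directories",
        "identify the framework and version",
    ],
    "smb": [
        "list shares anonymously",
        "check the SMB version against known issues",
    ],
    "ssh": [
        "note the banner/version",
        "check for weak or reused credentials (in scope only)",
    ],
}

def plan(services):
    """Given services found during enumeration, return an ordered to-do list."""
    todo = []
    for svc in services:
        # Known service: follow the taught checklist.
        # Unknown service: fall back to the research skill we claim to assess.
        todo.extend(
            NEXT_STEPS.get(svc, [f"research what '{svc}' is and how it is normally attacked"])
        )
    return todo
```

For example, `plan(["http", "webdav"])` yields the HTTP checklist plus a “research webdav” step, which is exactly the honest version of the webdav exam question: the research step is taught and expected, not a gotcha.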