How important programming theory is to real-world software development is a divisive question within the developer community, and you don’t have to go far to find people arguing on both sides.

A lot of big tech companies still base hiring decisions on a candidate’s understanding of programming fundamentals, much to the chagrin of developers who have learnt to code but do not hold a computer science degree, where much of that theory is traditionally taught.

Online coding courses and so-called bootcamps have exploded in popularity in recent years among those looking for a fast track into software development. These intensive courses usually focus on a specific programming language or platform, which students are taught to use over the course of several weeks.

While they might be a convenient choice for someone looking for a mid-life career shift, or for someone without the time or money to complete a two-to-four-year computer science degree, online courses rarely leave students with an understanding of programming fundamentals. That gap can come as a nasty surprise when it shows up in the interview process.

“If a person has spent their career learning programming in practice – which, we could say, is how most people learn to program nowadays, including me – these theoretical questions can be extremely difficult to answer, giving a feeling that these processes are ineffective and even unfair to a point,” says Bruno Rocha, an iOS developer and writer for programming blog SwiftRocks.

Rocha recently wrote about the topic of programming fundamentals and how important they are to a successful career in software development.


He points out that, while traditional tech companies have held on to their view of programming as a generic concept, newer companies have more practical, platform-focused needs, and it is exactly these roles that bootcamp-style courses are designed to fill.

“In short, the interview process of these companies is considerably different from the average one, with the former being more focused on theory and the latter being focused more on practical concepts.”

A lot of the anger towards tech companies that demand an understanding of computer science stems from the fact that such knowledge is largely seen as unnecessary for the tasks a coder will actually be expected to perform on the job.

The short explanation is that theoretical knowledge is unnecessary for most jobs nowadays, says Rocha. “Although they provide a great boost to your programming ability, it’s safe to say that from a career necessity standpoint, one does not need to master programming in a generic way if their job is to code for a specific platform, like web or iOS.”

Of course, the importance of this knowledge also depends on where or for whom an aspiring developer wants to work.

A common misconception is that the work performed by software giants is the same as for the average tech company, but this is not true, says Rocha. “Even though the job might technically be the same, these companies have considerably different needs and objectives, and I think it’s very important to consider and understand these differences when setting your career goals,” he adds.

Tom Crick, professor of digital education and policy at Swansea University, says people are becoming increasingly sceptical of tech giants’ ‘brutal’ technical interview processes, which aren’t necessarily an accurate means of determining a candidate’s core competencies.

“It’s quite attritional,” Crick says. “Some people like it as a badge of honor. But actually, I think if we are talking about that transition from a software engineering graduate into their first position, what are the expectations for their skills and their knowledge and understanding, and what they could actually demonstrate?”

Crick believes university programmes have a part to play in better preparing graduate developers for employment, noting that most big companies will expect a candidate to hold “a good degree from a good university”.

In which case, current software engineering programmes may need something of an overhaul, with Crick noting that many UK universities have simply tweaked their computer science degrees to accommodate more software-focused syllabi.

“The software engineering program has developed as ‘let’s tweak the computer science degree and add a bit more software-type stuff’,” he says.


“Actually, you’re starting to see, particularly because of the demand for people to program across a range of different sectors…the kind of breadth of knowledge and expertise goes all the way from, you need programming skills, you need some sort of formal kind of theoretical knowledge, but also you need the ability to understand what it means for designing software for user-centred design, and understand how that can be used in a variety of different contexts.”

When it comes to practical versus theoretical knowledge, Crick says it isn’t a case of either/or: neither is comparable to the other in terms of the experience it offers, and each has its own merits.

“I think they can be complementary. Doing a bootcamp can be a really rapid immersion into, say, if you wanted to learn Angular and the hot JavaScript framework that everyone seems to use in FinTech. Then, I can see that going on an Angular bootcamp would be really good, because it’d be much more industry-focused,” he says.

However, Crick also believes that, much like any technical discipline, mastering software engineering requires theoretical knowledge that developers won’t gain from an intensive online course. “I get quite frustrated when people say you just need to be able to program and you don’t need to do all that theory stuff, because the theory stuff is also quite important to understand mistakes that have been made in software for years and years,” he says.

“There’s a pragmatism around [the fact that] you cannot be an expert after doing an eight-week or a three-month program. It’s just the reality; it’s just impossible.”

There’s also the issue of hands-on experience. An intensive driving course will teach new drivers the practical skills needed to manoeuvre a car in a few short days, but it won’t give them the sustained experience that turns people into confident and capable drivers. The same is true for coding.


“You need to develop those competencies, those behaviors and practices, and clearly you develop some of those at university, but you also have to develop those in industry,” says Crick.

“I don’t think you could develop that over two weeks, eight weeks or three months. And I also don’t think you entirely develop that over the lifetime of an undergraduate degree. It’s the apprenticeship and the development and then you have to do that in the real world, in industry, too.”

The rapid rise of smartphone technology some 15 years ago led to a shortage of developers, which coding bootcamps sprang up to fill by enabling those without a college degree to get into the industry.

But with more new developers coming through these non-traditional pipelines, there is a risk of a mismatch between their own expectations and those of hiring companies, with Crick noting that coding bootcamps may make students over-confident about their capabilities.

“That’s not to say that a computer science degree is a prerequisite for going on to be a software engineer, because actually there are a lot of people who are software engineers who haven’t done a computer science degree,” he adds.

“But if you look across people with, say, a STEM science or a sort of STEM degree background, you’ll see a lot of engineers, mathematicians and scientists who are also very good at programming, because they have that strong technical foundation about how to think about solving problems.”