The 5 Oh-so-Breakable Myths About Learning Programming
The human is a curious creature. Unfortunately, not every subject succumbs to exploration, and deduction does not always lead to the right answer. Therefore, it’s only natural that various misconceptions will quickly surround anything that is not universally understood. You can hear myths about everything. Black holes. Sleepwalking. The memory of a goldfish.
While not nearly as mysterious, the profession of a programmer has attracted a lot of fiction and stereotypes as well. To a seasoned developer, it's merely a joke. However, it can easily mislead someone just looking to take up programming, be it as a creative hobby or a career choice. We have selected the five most popular myths about learning programming to debunk and leave behind as you step towards the career of your dreams.
1. To Be a Good Developer, It’s Crucial to Be Great at Math
This is probably the most common one. Where do people keep getting this from? Truth be told, this myth might be rooted in the fact that the first computers ever made were actually not much more than humongous calculators. Naturally, the pioneers of computer engineering were also mathematicians because the profession of a programmer simply didn’t exist yet. However, this was decades ago and has nothing to do with the way we understand computer programming now.
All the math you need for basic programming is the algebra you learned at school, logical thinking, and the ability to recognize patterns. Data science and game development do require some knowledge of more advanced topics, such as trigonometry. However, these are exceptions that most developers never face in their daily work. If you're more into web development or writing application software, you will be fine, even if you were never a math whiz.
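To see what that means in practice, here is a hypothetical everyday task sketched in Python: totaling an order and applying a discount. The function name and numbers are made up for illustration, and the only "math" involved is addition, a comparison, and school-level algebra.

```python
# A typical day-to-day programming task: no advanced math required.
# This is an illustrative example, not code from any real product.

def order_total(prices, discount_rate=0.1):
    """Return the order total, applying a discount when it exceeds 100."""
    total = sum(prices)                      # basic addition
    if total > 100:                          # simple comparison logic
        total = total * (1 - discount_rate)  # school-level algebra
    return round(total, 2)

print(order_total([40, 35, 50]))  # → 112.5
```

That's the level of math most working developers use daily: arithmetic, conditions, and patterns, not calculus.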
2. No One Will Bother With Your Résumé if You Don’t Finish College
Another one from the mixtape of classics. Yes, it used to be true when college or grad school was pretty much the only way to learn coding. That ship has sailed, though – as the demand for professional developers grew, so did the number of ways to become one. Apart from official study programs, we have books, bootcamps, mentorship programs, and online courses.
Learning computer programming online just makes sense. The supply is impressive: you can choose from free and paid courses and explore various topics without worrying about time or location restrictions. With platforms like BitDegree or Codecademy offering a gamified and interactive learning experience, you can even have fun while sharpening your skills. Unlike university programs and published books, online material is quick and easy to update, so you can always be sure you're learning about the most up-to-date technology.
Most online course platforms also issue certificates that you can add to your résumé along with your finished projects. In fact, most employers are much more interested in your portfolio than your diploma. Roll up your sleeves and get to it!
3. Programmers Are Antisocial Weirdos
Ah, yes. The classic trope, carefully cultivated by popular culture. Shows like Mr. Robot, Silicon Valley, or The IT Crowd never fail to portray a professional developer as an introvert who simply cannot comprehend talking to other human beings (or taking any proper care of themselves). Bad posture, a messy desk, and the same jeans their mother bought them in sophomore year. Maybe a superhero figurine somewhere. Yup, that's our IT guy right there.
Just like myth #1, this one comes from the old days. Believe it or not, during the fifties and sixties, employers deliberately preferred antisocial candidates when hiring. Why? Well, programming is a technical job that requires a lot of focus, and someone too social might have been deemed easily distracted.
However, psychology professor Timo Gnambs of Osnabrück University conducted an extensive study showing that the ability to code has no relation to neuroticism or disagreeableness – in fact, it was associated with openness. What's more, as the industry grew, companies and projects became more ambitious, too. Unless you're a freelancer, you rarely get to build something alone from start to finish. Nowadays, a developer is expected to be able to work in a team.
4. Women Have No Place in Tech
Unlike the others, this myth doesn't make you raise your eyebrows and wonder how anyone could have come up with it in the first place. It's no secret that women are underrepresented in the IT industry: only one in five IT bachelor's degree recipients is female, and women make up only around a quarter of the computing workforce.
The truth is, women were actually the pioneers of software programming: the first algorithm for an early computing machine was written by a woman, Ada Lovelace, born in 1815. Believe it or not, the gender scales only began to tip by the early 1970s. As the world realized the significance of programming, it was no longer deemed a simple underpaid job – and men wanted in. Even computer manufacturers started aiming their marketing at boys and men.
The fact is, there's absolutely no reason programming ability should correlate with gender. Adafruit Industries was founded by Limor Fried; the CEO of IBM is a woman – and so are the CEO of YouTube and the COO of Facebook. There's also a ton of programs aimed at narrowing the gender gap in the industry. Don't play your abilities down!
5. You Need to Choose and Learn the Best Language
This is a common one as well. An aspiring coder reviews some job listings, notices the demand for, say, Python or Java developers, and decides that to become a professional they now need to master said language – and that one language alone.
The problem is, there is no best language. While it's true that some are more popular than others, that is not the same as being better. The main difference lies in purpose, so decide what you want to create and go from there. Java is the official language for Android mobile development, Python is an excellent option for machine learning, PHP works well for server-side scripting… You get the gist.
So, What’s Actually Not a Myth?
Among a bunch of misconceptions, there are some common truths about programming. Yes, you really can start at any age: there are thousands of introductory programs for kids, as well as adults wishing to make a career switch later in life. Yes, it does mean a lifetime of learning. Yes, the demand for skilled programmers is still growing and doesn’t plan to stop anytime soon.
Yes, the most important thing is to start.
Which Myth Have You Believed In?
© 2019 Simon Adams