Want to be a developer? It’s not as simple as you might think.
You could be forgiven for thinking that a discipline based in science, logic, and reason would be easy to be a part of. You’d think that you could start by learning how to program in one of its many specialties, whether that be mobile, web, API, or embedded development, and that you could then continue the journey—increasing your skills, regularly learning more—for as long as your passion, education, and creativity could sustain it. You’d think it’d be that straightforward.
Not quite.
Like many human endeavors, being a developer has a bit of a Freemason-like culture attached to it. It is subject to emotional overreach, irrational arguments, and unfounded, yet long perpetuated, myths. These myths can, if you’re not careful, set you up with false and unrealistic expectations and preconceptions. They attempt to dictate what you need to know—and, sometimes, who you need to be—to be considered a real developer.
It’s difficult enough to know what it takes to make the grade as a developer. Confronted with these myths, the journey can be almost as difficult as deciphering:
Whosoever holds this hammer, if he be worthy, shall possess the power of Thor.
Today, I’m going to explore nine common notions that you’ll likely encounter at one point or another in your development career, and examine whether—or in what way—there’s any truth to them. Which are pure myth? Which only partly so? And which are not, as it happens, myth at all? Let’s separate fact from folk tale.
JavaScript Is Related to Java
Perhaps one of the most prevalent misconceptions, this one is based on the simple similarity of names. No, JavaScript is not Java. One is a scripting language, originally created to run within HTML pages; the other, a language that compiles to an intermediate form called bytecode, which is then run inside a JVM (Java Virtual Machine).
Sure, they share a name and some code syntax similarities, as you can see in the following two examples; but otherwise, they’re not the same thing. Here’s a small sample of Java, taken from the SparkJava project:
    import static spark.Spark.*;

    public class HelloWorld {
        public static void main(String[] args) {
            get("/hello", (req, res) -> "Hello World");
        }
    }
And here’s a small sample of JavaScript, specifically using the jQuery library which sits atop it:
    $(document).ready(function () {
        $('img').click(function () {
            // handle the click event on any img element in the page
        });
    });
Curious as to how JavaScript got its name? Here’s an excerpt from an interview with JavaScript creator Brendan Eich:
InfoWorld: As I understand it, JavaScript started out as Mocha, then became LiveScript and then became JavaScript when Netscape and Sun got together. But it actually has nothing to do with Java or not much to do with it, correct?
Eich: That’s right. It was all within six months from May till December (1995) that it was Mocha and then LiveScript. And then in early December, Netscape and Sun did a license agreement and it became JavaScript. And the idea was to make it a complementary scripting language to go with Java, with the compiled language. . . . And we were pushing it as a little brother to Java, as a complementary language like Visual Basic was to C++ in Microsoft’s language families at the time. And it took off.
Technically, JavaScript’s real name is ECMAScript, taken from its standard, ECMA-262. So why is it called JavaScript? In short, none of the major players (think Microsoft and Sun) could agree on a trademark.
Quoting the article further, the reason for creating JavaScript was:
. . . to make something that Web designers, people who may or may not have much programming training, could use to add a little bit of animation or a little bit of smarts to their Web forms and their Web pages.
So whilst Java and JavaScript have similar names, they’re quite different beasts. This one is total myth.
You Have to Begin Programming at a Very Young Age
This is another common misconception, largely fueled by the media. The younger developers were (or say they were) when they started, the higher the status and the greater the skill popular media seem to attribute to them. I’m looking at you, Hollywood.
Where did this idea first come from? A comment in this article from the New Yorker offers an instructive reminder:
In fact, Graham argues, [his company] has always advocated for greater inclusion in the tech industry. “The people who caricature us as being only interested in funding young hotshots forget that when we started, in 2005, young founders were not a privileged group but a marginalized one,” he wrote. “The fact that young founders seem a privileged group now is partly due to our efforts.”
Think back to pre-2000 and the rise of the dot-coms. Who doesn’t remember clichéd stories of investors wanting to meet and speak with the people behind big websites and web-based businesses, only to find that they were 14-year-old kids, still living at home?
At the time, these young innovators were considered anything but mature, bankable, and worthy of being taken seriously enough to invest in. Fast-forward to today and it’s become almost a rite of passage: you’re considered “old” if you begin over the age of 18. At least, according to the media.
Now some people, such as the late Aaron Swartz, who contributed to such widely used tools as RSS, Creative Commons, and Reddit, did start out at a young age. He won the ArsDigita Prize—at 13. However, there are many others, such as luminaries Alan Turing, Grace Hopper, and Brian Kernighan, who didn’t. Whilst all were bright, some even exceptional, none of them started in computers at a young age.
Perhaps programming never appealed to you, perhaps you had other commitments (like children!) that prevented you from taking it up, perhaps you just never had the exposure. Or perhaps you had certain preconceptions which stopped you from even considering it.
But perhaps time’s moved on, life’s changed, and now it’s front and center for you. The notion that you have to be young is a myth—nothing more. The right time to start is whenever you’re ready.
I’ve Created a Website, That Means I’m a Web Developer, Right?
Honestly, no. It just means you’ve created a website. Being a web developer is something else entirely: it requires becoming proficient in a wide range of skills, and updating and refreshing those skills on a regular basis.
From HTML and CSS, to testing, version control, performance optimization, and deployment, web development is a tough discipline, just like any other. Despite books telling you that you can master web development in just 30 days, it takes more time, more discipline, and more effort.
Don’t get me wrong, if you’ve developed a website, no matter how small, you’ve started on the path to being a web developer. You’ve seen the basics of what’s required, gotten some exposure to the various parts involved, seen how they all fit together. But one site doesn’t a developer make.
If you truly want to be a web developer, you need to study key books and blogs, attend conferences regularly, build a lot of sites, experiment with a wide range of tools and technologies, and get involved in the related communities. Sounds like a lot of work, right? Well, yes. As with any other worthwhile pursuit, any other worthy discipline, in life.
If you’re serious, if you’re passionate, if you’re dedicated, it won’t be a problem for you over the longer term. You’ll love it enough to keep on going. It’s a myth that creating a website makes you a web developer, but it’s pure fact that working at developing will make you a developer.
Real Programmers Code in C, C++, and Assembly Languages
That’s the attitude I had for years. I’ve coded, primarily in PHP, for a long time. If you use C or C++—or Ruby, Python, or Perl—or if you do even a quick Google search, you’ll see PHP written off as a language for script kiddies and hackers. As the developers at MailChimp are ready to acknowledge:
Despite its popularity, PHP is considered by the programming elite, almost without exception, as one of the worst languages currently in use today. The term “good PHP programmer” is considered an oxymoron. Yet it’s the primary language we use here for development, and it’s the only language we use for everything touching the production MailChimp application.
But notice: they use it anyway. They don’t let those groundless prejudices stop them.
Memes constantly circulate on the internet about how developers who favor one language see those who favor another. Here’s a good example from Pinterest on Java and C# developers. Does that mean that these feelings, these beliefs, these preconceptions are true, that they have any basis in reality?
No. It just means that they’re feelings, beliefs, and preconceptions. Now some languages are more challenging, some have a greater barrier to entry, some require more skill and dedication to master than others. C and assembly languages are two good examples. And they have their uses.
But believing that you’re not a real programmer because you’ve not used a specific programming language is, well, ludicrous. Perhaps one language’s style appealed to you more than another’s. Perhaps that specific language was the language required at work. Perhaps you had an environment which was more or less conducive to certain types of programming, such as a Linux laptop.
Don’t feel you’re not capable just because you haven’t developed in a given language. Instead, ask yourself three questions:
- Does your code do what it’s required to do?
- Does your code help users do their job quicker and easier?
- Does your code do what the client asked for?
If your code does all this, does it really matter what language you use? If you produce applications that do what they’re meant to do and that make people’s lives easier, it doesn’t matter what you wrote them in. Leave ego aside. Rate yourself on your merits. And leave myth behind.
HTML + CSS + JavaScript Is Always Essential
Well, quite honestly, it depends. If we’re talking strictly from a web development perspective, it’s nowhere near as clear-cut as it used to be. Once, I’d have said yes, without a doubt, because some years ago websites and applications required full page refreshes—to make even the most minor of changes.
These days, however, the game’s completely changed. It’s now commonplace to use JavaScript libraries and frameworks, such as Backbone.js, Angular.js, and Ember.js, to asynchronously retrieve data from a remote API and dynamically change a section of a page. Applications themselves are a lot more sophisticated (some might say complicated) than before, requiring many more moving parts. Why? Because all three technologies have advanced significantly. Developers are continually pushing the envelope, creating richer applications, which in turn leads users to expect more from the apps they use. As a result, it’s a lot more challenging to master any one of these technologies, let alone all three.
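To make that concrete, here’s a minimal sketch of the pattern, in the same jQuery style as the earlier sample; the /api/articles endpoint and the #articles element are hypothetical stand-ins. Note that only one section of the page is re-rendered, with no full page refresh:

    $(document).ready(function () {
        // Asynchronously fetch JSON from a (hypothetical) remote API...
        $.getJSON('/api/articles', function (articles) {
            var list = $('#articles').empty();
            // ...and dynamically re-render just this one section of the page.
            $.each(articles, function (i, article) {
                list.append($('<li>').text(article.title));
            });
        });
    });

Libraries and frameworks such as those mentioned above formalise this same idea, adding structure such as models, views, and routing on top of it.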
So it’s a qualified yes. A working knowledge of the fundamentals of each of these technologies is still essential. Any front-end job will, at one time or another, likely require a combination of interface design (HTML), interface styling (CSS), and interactivity (JavaScript). Whether you’re working for a startup, developing a custom application, or customising third-party software (such as Adobe Commerce (previously Magento), WordPress, or Drupal), you need to know them.
The two important questions then are:
- How to best manage the process of learning the essential information
- How to keep improving
A full discussion of either of these two is worthy of a complete article in itself. So here are my two key recommendations for how to grow your technical knowledge:
- Information aggregation
- Applied knowledge
Information Aggregation
As there’s so much information available, you need to find and collate the best sources, and make them readily available. To do that, I suggest using a combination of RSS feeds and Twitter lists, so that you don’t have to spend time going to a host of different sites, or wading through countless tweets, manually, to find the information you need. With these two approaches, the information will be collated and organised for you: all you have to do is pick what you want to know.
Of the available social networks, I’ve focused on Twitter because it’s where the techies are. As a result, it’s the place you’ll find out what they’re doing and what they’re working on. It’s also the place to become part of a community of your peers, where you can interact with them directly, in real time. If you haven’t already, start building Twitter lists of people, companies, and products in the specific domains of knowledge you’re keen to learn or improve on. And start aggregating quality resources, such as A List Apart, CSS-Tricks, and DailyJS.
Applied Knowledge
It should go without saying why you need applied knowledge. It’s one thing to read about concepts, tools, and technologies; it’s quite another to use and apply them. Get out there and build projects with technologies you want to learn—as soon as you can. Read the code samples that come with libraries, tools, and packages. Then build on them, change things, try things in a different way than what’s suggested. Get off the beaten track. This way, you’ll know how technologies work much more intimately.
But don’t learn alone. Learn with others. Use tools such as GitHub, Gist, and JSFiddle. Write code, share code, and ask others to comment on your code. Contribute to existing projects by submitting pull requests. By doing so, you’re learning for real. It’s OK to hack on projects which no one else will see. But when you have to put your work out there in public, where anyone can see it, both the way you approach it and the way you write it become much more focused, much more considered, even much more professional. Plus, the feedback you receive will help you learn much quicker than learning on your own.
So whilst it’s a wildly different landscape today than it used to be, continuing to learn HTML, CSS, and JavaScript—even if you don’t master them—is no myth.
I Need to Know Every Framework Ever Released
This belief takes it as writ that you have to learn every framework going, without considering what the advantages of a framework over rolling your own code actually are. So before you start down the path of acquiring even that first framework, take a moment to consider why frameworks exist—what it is they offer.
All done?
It’s called: efficiency. If you’re still thinking you have to learn each and every framework, then just consider what it would take to learn all the available frameworks in even one language. In PHP, for example, I can point to at least 20, and those are just the more common, the more notable ones. Do you really want to learn them all? Do you really have that much time, desire, and demand? Does it buy you anything?
There are frameworks for nearly every type of development you can imagine, whether that’s micro sites, short-term applications, or more “enterprise-y” development. Before you consider learning a framework, instead ask yourself what type of application you are building. Tailor your selections to that. And don’t swing in the other direction to try to use one framework as a one-size-fits-all solution either. Sure, you could use Zend Framework for everything, but for a microsite, it’d likely be overkill. Conversely, you could use SlimPHP for everything, but you’d likely be reinventing the wheel and writing a lot of duplicate code if you were aiming for more enterprise applications. Go for the framework whose scope aligns with your task.
Another reason not to try to learn them all? Duplication. Many frameworks either target exactly the same space or implement many of the same concepts: think SlimPHP, Silex, and Lumen. Several others are implementations, or ports, of Ruby’s Sinatra framework. Sure, they do things slightly differently, but overall they’re quite similar. If you learn one, you’ve gone a long way towards learning the others.
Finally, a lot of frameworks aren’t an all-or-nothing arrangement. Many of them, such as Symfony 2 and Zend Framework 2, are component libraries, where you can pick and choose the parts you need and skip the ones you don’t.
This one is pure myth—one that shows no appreciation of the level of effort it would require, and offers no solid justification for that effort.
It’s More Important to Just Pick a Framework (or Language) and Start
As a corollary to not learning each and every framework in existence, this one has more than a grain of truth to it. Take frameworks for starters. Frameworks are normally designed to handle a wide variety of standard, repeatable functionality, which would otherwise have to be implemented by hand, over and over and over again.
For example, in the web development world, most frameworks take care of functionality like routing, dispatching, creating requests, handling responses, forms, data source interaction, input validation, and so on.
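As a rough sketch of what that buys you, here’s a minimal example in JavaScript using Express, a micro framework in much the same spirit as SlimPHP; the route and port are illustrative only:

    // Express handles the routing, dispatching, and response plumbing
    // that you would otherwise write by hand on top of Node's raw
    // http module.
    var express = require('express');
    var app = express();

    // Declare a route; the framework dispatches matching requests here,
    // leaving you to write only the application logic.
    app.get('/hello/:name', function (req, res) {
        res.send('Hello, ' + req.params.name);
    });

    app.listen(3000);

Every framework mentioned in this article offers some variation on this pattern; what differs is how much else comes bundled with it.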
So, yes, this can be a good approach to take. However, it’s not a case of just pick any old framework and begin learning. There are a wide variety of frameworks, some of them adhering to widely different philosophies. For example, there’s The MicroPHP Manifesto. Here are four of the key points:
- I think PHP is complicated enough.
- I like building small things with simple purposes.
- I need to justify every piece of code I add to a project.
- I want to write code that is easily understood.
Focusing on the third point, there are some strongly held beliefs as to what a framework, a true framework, should and shouldn’t be. Some people argue, for example, that no matter how helpful a framework is, there’s no place for such features as database interaction. Personally, I quite like having it available, but I can understand the objections.
You’ll see these beliefs, these philosophies, reflected both in the design decisions and in the feature set a framework supports, or doesn’t. For example, SlimPHP is lean and trim, providing just the bare essentials required to make a working web application. Contrast this with Lumen and Silex, both of which, although they are microframeworks, also provide data source interaction, encryption, and more as part of their core feature sets.
Keeping this in mind, one of the first questions to consider is, will you be working with smaller or larger applications? You have to pick the framework that’s right for your needs.
Using the web as a point of reference, my personal recommendation would be to start with one of the available microframeworks and learn the core of how a web application works. Write a number of different applications with the framework, which will expose you to different aspects of the feature set and to how applications work.
Once you feel comfortable with the process, progress to a full-stack framework, and see what it’s like having everything at your disposal. When you’ve built a few working applications with the full-stack framework, start swapping out some of the framework’s libraries for third-party libraries, so that you don’t just learn to develop as the framework dictates.
That’s frameworks. Now let’s consider languages. Pick your language with thought and care, based on what you’re developing. Web-scale application? Script to support a deployment task? Front-end GUI layer? And so on. Whilst some languages are popular, even so far as to be de facto standards (think PHP for web), that doesn’t make them necessarily the right choice for each task.
There are several important points to consider before choosing a language, points which help determine whether it’ll be right both for your team and for your project. Consider, for example, these:
- What kind of community sits behind the language?
- Is that community active, thriving, supportive? Or is it small, perhaps faltering or even exclusive?
- What’s the availability of support resources such as conferences, IRC channels, blogs, books, podcasts, and people to talk to?
- Is the language active, with regular commits, pull requests, and comments? Or is it rapidly becoming dormant?
- Is it well-supported with tools such as deployment tools, package managers, code analyzers?
I rave about PHP in large part because of the community behind it, conferences like PHP UK Conference, and podcasts such as PHP Roundtable and /dev/hell. PHP’s not, technically, the best constructed language. But factors such as these, and a number of others, more than make up for its shortcomings.
These aren’t the only factors, of course. You must look at the whole picture. Let’s say you’ve found a language which isn’t as popular and doesn’t have as many resources to support it, but which you believe is capable and perhaps the right fit for your team. Take Go, for example. It’s nowhere near as popular, nor as widely supported, as PHP (or Ruby for that matter). However, what it lacks in size and age it’s definitely making up for in both rapid community growth and exemplary design. In fact, it’s the programming flavour du jour; Ruby’s long since been supplanted as the language all the cool kids and hipsters want to be seen using.
But don’t hold that against it. Go’s an excellently designed language: simple, efficient, easy to understand, and easy to develop with. More and more projects are being created with it, and the language itself comes with exceptional tooling support, along with a growing body of blogs and technical resources.
So just because a language is younger, smaller, and less widely known doesn’t mean it’s not the right choice.
And it’s all about that choice. There’s nothing inherently wrong with frameworks or languages, and you can learn a lot by just using them. Just don’t pick one blindly, or use it simply because “everyone else is.” Do your due diligence, and make an informed decision. When you understand why you’re using them and the benefits they provide, you’ll likely make better decisions, ones which stand the test of time. And there’s nothing mythical about that.
You Need to Be a Genius, like Tony Stark
Akin to the tale of starting young, there’s often a misconception that you have to be a genius, rather like Tony Stark in the current Iron Man and Avengers franchises. Nothing could be further from the truth.
I’m not saying there are no genius programmers, far from it. However, most of the programmers I’ve met—and I’ve worked with some very gifted, talented, and hard-working folks, across a range of industries and languages—are normal people, people who keep getting better and better each and every day because they work at it.
These are people who didn’t start out with Einstein-like IQs. No, they often found they had a love of computers and computer science, of problem solving, even of art (yes, computer programming is something of an art). And that passion fueled their continued learning.
I’d be willing to bet that if you got talking to some of the top programmers you admire, the majority of them would quickly dispel this myth simply by how they approach the world. These aren’t people who walk around with a hubristic swagger, ready to whip out a pen and sign autographs at a moment’s notice because they’re so bloody brilliant. On the contrary, they’re often rather self-deprecating, all too ready to give credit to others, and all too willing to share the mistakes they make on a regular basis.
Take this talk from Django project lead Jacob Kaplan-Moss. Besides being a great talk, it’s instructive to hear Jacob explain why people often harbor the false belief that well-known figures must be geniuses. Oftentimes it’s not them, it’s us: we hold this misguided feeling that such leaders must, naturally, be better than the rest of us. The reality is far more complex.
The best are often the best precisely because they surround themselves with great people, people who have the skills they themselves lack, whether in accounting, marketing, debugging, testing, architecture, or any number of other areas.
Richard Branson, one of the greats of business, perhaps said it best:
The key enterprising skills I used when first starting out are the very same ones I use today: the art of delegation, risk-taking, surrounding yourself with a great team and working on projects you really believe in.
Bear these sobering thoughts in mind when you come across this myth in the wild.
Men Make Better Programmers
I’ll finish off with my personal favorite, one which has no basis whatsoever in reality. Wherever you go, it’s true, you’ll usually find that the tech sector is heavily biased towards men. Whether it’s programming or some other subdiscipline, women are often in the minority.
Go to just about any major university and do a quick count of the number of women in computer science courses. Have a look at the number of programmers either in tech startups or with more established companies. Think too about the developers you know. How many of them are women? Odds are that the majority will be men.
According to an article in The New York Times, in the US alone, on average:
- Only about 12 percent of people in tech are women
- Just 18 percent of computer science graduates are women
Have you ever wondered why such statistics exist? No? If you’re aware of the foundational roles women played in tech, these stats might—and rightly should—alarm you. Here are some interesting facts you might not be aware of:
- Six women programmed one of the most famous computers in history—the 30-ton ENIAC—for the United States Army during World War II.
- Grace Hopper was one of the first programmers of the Harvard Mark I computer in 1944, and she invented the first compiler.
- Ada Lovelace is considered the first computer programmer.
- Jean E. Sammet developed FORMAC, the first widely used language for symbolic mathematical computation, at IBM.
- Sister Mary Kenneth Keller was part of the team which developed BASIC.
This is just a handful of the many women who’ve made great contributions to technology, but you’ve likely not heard of any of them, save Ada Lovelace.
There are a wide number of theories for why this is so, for why more women don’t go into the sciences in general and computer science in particular. A lot of it has to do with our educational system, and with how girls begin to be discouraged from math and the sciences at about middle school; some of it may also have to do with hiring practices.
This one’s a myth, but so as not to perpetuate this disconnect from reality, those among us who are men need to do all we can to encourage like-minded women—our sisters, girlfriends, wives, daughters—to enter the field. Technologically minded women, for their part, should continue to dispel this unfortunate myth, one person at a time.
The Bottom Line
And that’s nine commonly held, often counterproductive, beliefs you’ll encounter in one shape or another throughout your career.
Don’t let myths (or near-myths) such as these stop you from becoming the best developer you can. Don’t let them stop you from embarking on that path, or from further honing your skills at any point along that path.
You don’t need to have a genius-level IQ, to have started at any specific age, or to know everything to be a developer. Perhaps you started later in life, perhaps you started younger. Perhaps you had more exposure or opportunity, perhaps you had less. Perhaps you come from a poorer community, perhaps you come from a richer one. It doesn’t matter. It also doesn’t matter whether you’re a hobbyist programmer or a professional one.
What matters is that you start, and that you keep on going—and learning. It doesn’t take much to start, and the rewards can be so great. Don’t let baseless myths stop you from being a part of this great field of human endeavor.