
Why do front-end developers make significantly less than back-end software engineers?

I have worked in the software industry for many years. I have three children in the industry: one is a graphic designer who also does front-end development, one works in security, and the third is a ‘full stack’ developer. All hold Honours degrees. Their ages span up to 10 years, in the order given, so direct comparisons are difficult. I do not know their actual incomes.

The designer is, I believe, the lowest paid of the three. The other two are exceptionally well paid. I can gauge from their lifestyles that there is a difference.

Some factors, I believe, include:

  • Risk to the company if the job is not absolutely correct
  • Risk to the company from external threats
  • Perceived and actual complexity
  • Perceived value added
  • Level of training and ONGOING training
  • Risk to the worker of technology migration making their skills obsolete
  • Complexity of the day-to-day job
  • In-depth knowledge of the business and its processes required to do the job effectively
  • Supply and demand within the skills pool
  • Hard skills vs soft skills. Hard skills can be quantitatively measured; soft skills are harder to measure. It is easier to sell a $1M bulldozer than a $1M app or a $1M web page design. Yet a web app I know of made $100K for a client on its first day, and continues to do so. The bulldozer is unlikely to ever return value at this level.
  • Personality - some are focused on the money, others are driven by other factors. This shaped both their choice of career and their selection of jobs.
  • Job Market - The job market changed as each of them entered and moved in it. “Full Stack Developer” was not an option when my first child entered the market. Front end and back end were not even concepts when I entered the field.
  • Technology shifts - as per Job market
  • Bleeding edge vs established vs obsolete technology, e.g.:
    • COBOL programmers were at first poorly paid in the early 1960s (the language only appeared in 1959), as there was not yet much demand for programming skills.
    • Then better paid in the 70s and 80s as more mainframe computers were introduced. The skills pool was still developing as training had not kept up with demand.
    • By the mid-1990s COBOL had become poorly paid in comparison to newer languages. There was an oversupply as companies and new development shifted to microcomputers and the Internet, and COBOL lost favour; only the requirement to maintain legacy systems kept the COBOL job market alive.
    • In the late 1990s there was a spike in demand and pay as companies addressed unexpected Y2K-related risk in what were, by then, largely legacy systems.
    • In the 2010s, COBOL once again became exceptionally well paid as the skills pool diminished with many of the remaining programmers retiring or taking on other jobs. Legacy systems still needed maintenance due to changing business requirements.
    • Phasing out these vital but ageing systems would prevent future problems for owners who cannot otherwise migrate away from them. The remaining COBOL systems are large and complex and embed a huge amount of business knowledge, little of which can be recovered from the code or from business users; this is largely why they have not been replaced and retired. Programmers of the calibre required to make the necessary changes would, in general, be trained in other languages that offer a better career path at this time.

A professor of psychiatry with a special interest in HR once told me, “Software development is the single most difficult job there is. PERIOD.”

Very few people can produce good-quality code, given the intellectual challenges it presents. Even fewer stick with programming and keep their skills current as the market shifts.