ACM CareerNews for Tuesday, July 19, 2022
ACM CareerNews is intended as an objective career news digest for busy IT professionals. Views expressed are not necessarily those of ACM. To send comments, please write to email@example.com
Volume 18, Issue 14, July 19, 2022
Despite growing concerns about layoffs in the tech sector, the unemployment rate for tech occupations actually dipped to 1.8 percent in June, a notable decline from 2.1 percent in May. The tech unemployment rate is also well below the overall national unemployment rate of 3.6 percent. According to the latest data from the U.S. Bureau of Labor Statistics, employers posted more than 500,000 open tech jobs in June, a year-over-year increase of approximately 62 percent. The stronger-than-expected job gains reaffirm the critical role of tech across every sector and every business in the economy.
The surprisingly good numbers on IT employment also highlight the limitations of projecting company-specific hiring practices onto the broader tech workforce. In other words, stories about high-profile companies slowing or freezing hiring are not necessarily indicative of the tech industry as a whole. Although some companies are facing significant headwinds, organizations everywhere still need technologists for everything from running websites to securing the tech stack against internal and external threats. Moreover, the need for technologists extends well beyond the tech industry itself: virtually every corner of the economy, from agriculture to manufacturing, relies on them. In short, given the widespread reliance on data analytics and cloud-based applications, organizations of all kinds are experiencing significant demand for tech workers.
Across the board, salaries in tech professions have risen 6.9% in the past year, with some positions (including web developer, technical support engineer, database administrator and data analyst) leading the pack with double-digit growth. According to experts, the spike in salaries is rooted in basic supply and demand. IT upgrades and build-outs put on hold during the pandemic resumed in 2021. Companies committed to larger, more aggressive investments in tech (especially in areas such as AI, analytics, automation and cloud) in order to overcome disruptions of the previous year. Meanwhile, the supply of talent to implement, operate and manage these technologies did not keep up with demand.
While the hiring surge may level off over the next year, salaries are expected to remain high. Wages for tech workers were already on the rise, according to U.S. Bureau of Labor Statistics data. The median wage in tech rose to $94,058 in 2020 from $86,852 in 2019, an increase of roughly 8%. Of course, some of that increase is due to organic growth rooted in modernization efforts underway pre-pandemic. However, even in the wake of the pandemic, job and salary growth remain strong. Postings for open tech positions were up 62% from June of last year. There are simply more companies hiring, and there is more technology used across business functions. Overall, that translates to a need for more workers.
Among all the tech job areas, cybersecurity is growing the fastest, according to tech training provider Coding Dojo, which recently released its list of the ten hottest tech jobs for 2022. To create the list, Coding Dojo first looked at Glassdoor's report on the 50 Best Jobs in America for 2022 and captured each role that fit its definition of a tech job. The company then analyzed the number of tech job listings on Indeed and the projected growth rate for each role as given by the U.S. Bureau of Labor Statistics. Finally, Coding Dojo combined this data to generate its Top Ten list.
With cybersecurity jobs in hot demand, the role of information security engineer came in at the top of the list. This job title refers to a mid- to senior-level cybersecurity professional who designs and builds security systems for networks and applications. To qualify for such a role, you should have years of experience in cybersecurity along with knowledge of Linux, Unix and Java. The average salary for an information security engineer is $119,000, according to Glassdoor. Coming in second on the list was the job title of full-stack engineer. A full-stack engineer is a versatile programmer who can code across multiple stacks, such as front-end clients and back-end servers. Demand is high here because employers are always eager to hire well-rounded developers who can do it all. People looking to pursue this specialty should learn additional stacks to increase their versatility.
As Cyber Talent Demand Heats Up, Hiring Managers Should Shift Expectations
Cybersecurity Dive, July 12
Based on recent data analysis from CyberSeek, the job market for cybersecurity talent is on a record-setting pace. U.S. employers posted roughly 715,000 cybersecurity roles in the 12-month period ending in April 2022. Demand for cybersecurity jobs increased 43% over that 12-month period, compared to 18% for the rest of the job market. The growth rate for cybersecurity talent is some of the fastest ever recorded. For example, in the first four months of 2022, each month broke the previous monthly record for the most jobs tracked. High demand has come at a cost, though: Cybersecurity jobs are taking longer to fill, while cybersecurity salaries are increasing at a faster rate than salaries in the overall tech market.
Many companies cite a talent gap to explain their inability to fill cybersecurity roles, but a big part of the problem may be that hiring managers are looking for more than they can find. According to the latest figures, more than 60% of companies have unfilled cybersecurity positions and understaffed teams. The top skills gap, cited by more than half of the cybersecurity professionals surveyed, relates to soft skills such as problem solving, critical thinking, and communication. The top factor used to determine whether a candidate is qualified is usually prior hands-on cybersecurity experience, followed by credentials. This leads to a paradox: there are almost 1 million open jobs, yet few employers are willing to hire junior people. In an ever-expanding cyberthreat landscape, and with increased scrutiny of cybersecurity practices from government entities as well as customers, few companies are willing to put someone with just a few months of experience in charge of protecting valuable digital assets.
The 5 Things Gen Z Is Looking For in a Job and Career
Entrepreneur.com, July 12
Just as Millennials changed workplace culture forever, the same phenomenon appears to be happening with Gen Z, which has officially taken the place of Millennials as the largest generation. Generation Z includes people born after 1996 and currently encompasses 32% of the global population and 11% of the workforce. And these numbers are growing rapidly. According to some estimates, by the end of 2022 members of Gen Z will make up 24% of the global workforce; by 2030, that figure will hit 30%. This generation has the potential to redefine what it means to work a typical 9-5 corporate job, with the implication that organizations may have to change how they recruit, hire and retain these employees.
For members of Gen Z, purpose in the work that they do matters. Rather than purely chasing after the highest salary, members of Gen Z value the mission statement and impact of the places where they work. This could be a software company aiming to protect the digital privacy of individual users, or a consumer-facing tech company attempting to reduce its overall carbon footprint and impact on the environment. According to a recent report, 42% of the members of Gen Z would choose to work at a company whose values they align with over one that offers a higher paycheck. Besides working to earn a living, they also want to feel like they are enacting real change in their fields.
6 Ways to Make Performance Reviews More Fair
Harvard Business Review, July 12
Companies simply cannot afford to keep a performance review system in place that is biased against certain employees, misrepresents their skills and abilities, or prompts them to seek jobs elsewhere. As you may have already encountered in your tech career, bias can be a factor in the performance review process. Because the criteria for evaluation are so often vague and open-ended, it is dangerously easy for patterns of bias to creep into the process and for managers to be guided by implicit biases. Most obviously, different standards of behavior can be applied to different groups of people, and managerial feedback can reflect or reinforce negative stereotypes. All of these factors can lead to inappropriate assessments of performance, which in turn can prompt talented employees to leave.
The good news is that there are six behavioral nudges (four for managers and two for employees) that could potentially improve the performance review process. For example, about a week before employee performance reviews, senior managers could guide groups of six to eight junior managers through multiple hypothetical assessment scenarios based on the profiles of real employees. The junior managers review the files, make their own assessments, and then share and discuss their reviews with the group. To take the exercise up a notch, participants could compile their assessments before the group discussion and display the comparisons on a whiteboard or slide deck for all to see during the exercise. This exercise might bring to light outside factors that are affecting how employees have performed, as well as the behavioral contexts that affect how managers assess performance.
The Most Important Soft Skills for Tech Jobs
ZDNet, July 14
While basic coding and programming skills are paramount for landing just about any computer science job, you will also need soft skills for the most prestigious and highest-paying tech jobs. Soft skills, also called people skills, enable professionals to collaborate and communicate effectively with others. Essential soft skills for tech jobs include teamwork, adaptability, and communication. Each skill can help you work more collaboratively, use your intellectual and creative strengths, and maintain mental discipline. Computer science degrees and other tech-related college programs often teach these skills alongside hard skills. But you can also cultivate these soft skills on your own time via online courses or independent projects.
Adaptability is a key trait to master if you want to advance in your tech career. Late-career tech professionals will attest that an ability to learn is more important than mastery of specific computer programming languages, software or platforms. Most tech jobs do not expect you to know everything. Instead, they want you to show learning agility, or the ability to pick up new skills quickly. Adaptability ensures that you can navigate novel problems effectively and innovate. You can increase your learning agility and become more adaptable by maintaining a lifelong commitment to learning in your downtime. Consume tech podcasts and news and pick up new skills for fun. You never know when they might become useful.
Here Are the Best and Worst Countries for Remote Work
ITPro Today, July 13
In the aftermath of the pandemic, patterns of remote work appear to be changing. Remote work is shifting away from low-cost, far-flung destinations in favor of cities and regions closer to the organizations and corporations that depend on those workers. In fact, when it comes to the best remote working destinations, European countries dominate. Germany is the top-ranked country for remote working, followed by Denmark and the U.S., according to cybersecurity software firm NordLayer. European countries make up the remainder of the top 10, with the exception of Singapore, which ranks ninth.
Ever since the beginning of COVID-19, remote or hybrid work has been inevitable even in companies that previously preached the importance of face-to-face interactions. At the same time, the changing nature of remote work has altered the relative importance of key criteria. The ranking of 66 countries is based on criteria such as cybersecurity resources; economic and social conditions; health care options; English proficiency; digital and physical infrastructure; and COVID-19 response. On cybersecurity, European countries, particularly smaller ones like Lithuania, Estonia and Slovakia, are the safest, according to NordLayer.
Building a Practical Quantum Computer
Communications of the ACM, July 2022
There are growing signs that a career in quantum computing might one day become a reality. The first step, of course, is building a practical quantum computer. Researchers have speculated about quantum computation for decades, but only recently have they seen steady experimental advances, as well as theoretical proofs that quantum computers can efficiently do things that classical computing devices cannot. The field is attracting billions of dollars from governmental research agencies and technology giants, as well as startups. Conventional companies are also exploring the potential impact of quantum computing. Despite this excitement, and despite successes in adjacent areas such as quantum sensing devices, quantum computing has yet to make practical contributions.
Continued large national investments in quantum computing are motivated in part by concerns about falling behind other countries. For example, government security organizations worry that factorization capability could break secure communications. However, experts note that certain types of encryption can still evade any reasonable quantum machine, so those concerns may be overblown. They also note that if cost-effective applications are slow to arrive, funding for quantum computing could erode. One possible scenario is a prolonged downturn for the field, similar to the downturns that dampened enthusiasm for artificial intelligence in past decades. It is always possible that things will not work out. However, there are enough smart people, and enough money being invested in quantum computing, that many experts are more optimistic than not. With so many distinct approaches to quantum bits, only one of them needs to work.
Debunked Software Theories
ACM Ubiquity, June 2022
Why do some software theories succeed, while others fail? This question can be answered using the scientific principle known as falsifiability, first introduced in 1934 by philosopher Karl Popper in his book The Logic of Scientific Discovery. Popper maintained that scientific claims (e.g. propositions, hypotheses, theories, models, assumptions) should be posed in such a way that they are open to being proved false when tested. The reason is that universally quantified claims generally cannot be verified; they can only be falsified. Falsifiability is used to weed out incorrect scientific claims: for a claim to be accepted, it must withstand attempts at falsification. With that in mind, the article takes a closer look at three software theories whose claims were tested with experiments.
The first software theory to be debunked says that multi-version programming produces far fewer errors than single-version programming. Its falsification has significant implications for program verification. The second theory claims that certain software metrics predict defects in software modules. Its falsification casts doubt on whether metrics from one software project provide insight into defects in another project. The third theory claims that using UML models (graphical representations of software architectures) leads to fewer software defects. Its falsification casts doubt on the utility of UML models. The three examples illustrate how the scientific process tests theories with experiments and debunks false claims. Besides being a catalyst for the search for better theories, falsification also helps practitioners free themselves from unworkable guidelines and techniques.
Copyright 2022, ACM, Inc.