ACM CareerNews for Tuesday, March 23, 2021
ACM CareerNews is intended as an objective career news digest for busy IT professionals. Views expressed are not necessarily those of ACM. To send comments, please write to firstname.lastname@example.org
Volume 17, Issue 6, March 23, 2021
There are 10 IT job titles likely to see significant growth in 2021 and beyond, including data scientist and machine learning specialist. Despite all the uncertainty surrounding the pandemic, these career paths should provide some stability for technologists. By focusing on these job titles, technologists will not need to wonder whether their skills are still in demand or whether they could find a new job if they needed one. The good news is that there is a growing number of in-demand job titles for these turbulent and unusual times.
Data science has been one of the hottest job areas for several years now, and the pandemic has not changed that. In fact, data scientist is the job role with the fastest-growing demand across industries. According to Robert Half Technology, the median salary for data scientists is $129,000, and Indeed.com had 13,299 data scientist jobs listed. To get a job as a data scientist, you will need advanced data analysis skills and an advanced degree in math, statistics or a related field. Data analysts often work hand in hand with data scientists. While they do have a lot of experience with data analysis, data analysts generally do not have advanced degrees. Their median compensation in the United States is $103,250, according to Robert Half Technology. There are also more positions available for data analysts than for data scientists: Indeed.com had 19,241 data analyst positions listed.
How much your salary increases over the course of your technologist career depends on a variety of factors, including your specialization and career track. Data from the latest Dice Salary Report offers some key insights into what technologists make as they rack up years of experience. Overall, average technologist salaries in the U.S. increased 3.6 percent between 2019 and 2020, reaching $97,859. However, there is a significant difference in salary for those just starting out in tech and those with more experience. Those with less than a year of experience made an average of $57,031 in 2020, up 3.3 percent from 2019. Technologists did not begin to approach that overall average of $97,859 until they logged nearly a decade of experience.
At the furthest end of the experience chart, those technologists with more than 15 years of experience pulled down an annual average of $117,187 in 2020, up 2 percent from 2019 when average salaries came to $114,915. Those with 15 years of experience have often ascended into a managerial role and have the kind of specializations and skills that allow them to unlock considerable compensation and benefits. No matter how many years you have spent in the tech industry, there are lots of ways to boost your salary, including merit increases, negotiating for better pay, or developing new skills. Jobs that pay six-figure salaries include not only IT management (such as CEO, CIO and CTO), but also architect and engineering roles, particularly in hot areas such as cloud, cybersecurity, and data science.
Despite the pandemic, opportunities in IT are skyrocketing. According to the U.S. Bureau of Labor Statistics, employment for computer and IT-related roles is expected to grow 11 percent from 2019 to 2029, which is much faster than the average for all occupations. As hiring activities moved online during the pandemic, applying, hiring, and onboarding became accessible to more candidates. This increased flexibility, along with higher unemployment rates, has expanded the talent pool. Just keep in mind, however, that a higher volume of applicants can make it more challenging for a single candidate to stand out. For IT job seekers looking for a competitive advantage, the article provides 10 tips that can help you stand out in a crowded job market.
Make the most of job platforms and your network when looking for a new position. Hiring managers rely on employee referrals to find quality candidates, since referrals provide an initial layer of vetting. Recruiters and talent acquisition professionals also look to job boards and platforms like LinkedIn to identify potential candidates, making those platforms a crucial asset for anyone seeking new job opportunities. Being active on LinkedIn matters because its algorithm prioritizes job seekers who are active on the platform. Additionally, your activity gives employers insight into your personality and interests, helping create a human connection. Pursuing informational interviews also helps. Early in your job hunt, reach out to your network and mutual connections, whether there is a job opportunity currently available or you are simply interested in learning more about an organization or industry. Talking face-to-face (even over a video call) can provide unique insight into a potential employer and may even yield a competitive edge, such as a referral or a place on a hiring shortlist.
Why You Should Learn to Code Regardless of Your Current Role
Silicon Republic, March 3
Learning to code now could future-proof your career for years to come. In its most basic sense, coding is translating logical actions into a language that a computer can understand. It is not just young people yet to enter the workforce who should learn to code: it is all of us. From the freelancer needing to make website edits to the finance team wrapping their heads around budget models, coding is for everyone, and learning how it works will make us better at our jobs. As a result, many organizations are actively supporting their employees in learning and developing coding skills.
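That "budget model" example is more literal than it sounds: a few lines of Python are enough to turn a forecasting rule into something a computer can execute. The function name and figures below are invented purely for illustration:

```python
# A toy budget model: translate the logical steps of a quarterly
# cost forecast into instructions a computer can execute.

def quarterly_forecast(monthly_costs, growth_rate):
    """Project next quarter's total spend, growing each month's cost."""
    projected = []
    current = monthly_costs[-1]          # start from the latest month
    for _ in range(3):                   # three months in a quarter
        current = round(current * (1 + growth_rate), 2)
        projected.append(current)
    return sum(projected)

# Example: the last three months of costs, growing 2% per month.
print(quarterly_forecast([10000, 10200, 10404], 0.02))
```

Even a toy script like this captures the article's core point: once the logic is written down precisely, the computer handles the repetitive arithmetic.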
As the world of work changes and more roles and tasks are automated, it pays to future-proof your career. According to McKinsey, AI and automation will transform the nature of work and the workplace. The firm predicts that machines will be able to carry out more of the repetitive tasks undertaken by humans; as a result, some occupations will decline or change, while others will grow. There will still be enough work to go around, because technology will create new jobs and change others, but the workforce must adapt to these changes and learn new skills. This workforce will need to co-exist with increasingly capable machines, and what better way to do that than learning the languages that control them? Even if a chatbot or robot were to take over the administrative or repetitive parts of a customer service role, freeing people up to do more uniquely human tasks, that technology still needs someone writing the code behind it. This future emphasizes the need to learn new skills and to consider where technology will play a role alongside your current skillset over the next five, ten or fifteen years. Once you know how your role may change, you can learn the skills that will keep you employable.
Hiring the Right Person Is More Crucial Than Ever During COVID-19
Tech Republic, March 16
The IT job market during the global pandemic has been marked by instability. While some industries are thriving in the digital workspace, others are suffering. Many employees are facing layoffs or have been forced to leave work because of increased pressure as caregivers. With jobs disappearing and reappearing in certain sectors, hiring managers have been facing unique challenges. Finding the right candidate is of critical importance during these uncertain times, and hiring managers are making greater efforts to ensure they do not recruit the wrong one.
According to a survey of nearly 3,000 senior managers conducted by staffing agency Robert Half in late 2020, recruiting the wrong candidate is far more common than you might think. A significant portion of those surveyed (76%) say they have chosen the wrong person for a job, and 64% believe such mistakes now carry greater costs. Those surveyed pointed to several reasons why these choices are critical, pandemic or not: 37% of respondents cited time wasted on hiring and training, 20% pointed to increased stress on supervisors, and 17% said that bad decisions hurt staff morale.
6 Warning Signs That You Should Not Take the Job
Entrepreneur.com, March 21
Many times, when young people are looking for their first job, they are so eager to land a position that they do not adequately analyze whether it is the right place for their professional development. Yet there are six red flags indicating it may not be a good idea to take a job; ignore them, and in a few months you may be questioning why you accepted the position in the first place. For example, if the job interview reveals many more responsibilities than the job description reported, pay attention. It is normal that not every duty is detailed, but if the list grows alarmingly, it may not correspond to the pay that is offered. Analyze whether these other activities will help you gain new skills, or whether you will simply be covering obligations that should belong to another area or person.
One potential warning sign is a lack of learning opportunities in your new job. Analyze how much opportunity for growth the job will offer you, either through the activities you will carry out on a daily basis or through additional career development options provided by the company. For example, your new company should offer training courses, or even encourage its employees to take courses elsewhere. Find out if the company allocates a special budget to this area. Even if you are not very clear about where you want to go professionally, there may be signs that the position is not for you. Think of it as time you will spend doing something that will not make you happy or give you the experience or learning you require. Staying in such a role is a way of losing competitiveness, and the dissatisfaction could affect your productivity.
Choosing Cybersecurity as a Career
Analytics Insight, March 21
Cybersecurity is now one of the most sought-after career options, especially for IT professionals looking to change their job function or role. Cybersecurity offers a wide range of fields that one can choose to work in. To enter the world of cybersecurity, candidates should already have a solid base of technical knowledge as well as business insight into how and why cyberattacks are taking place within their industry of choice. Now is the time to pay extra attention to what is going wrong with current cybersecurity measures. Recruiters are looking to hire professionals who are savvy enough to keep these cyberattacks at bay.
A relevant degree in a technology field makes it easy to get started, and knowledge of advanced cybersecurity concepts or regulatory policies makes entry easier still. If you do not already have this background, there are many courses available to help you. Both academic institutions and private corporations have developed an array of cybersecurity courses that ensure you get practical knowledge before stepping into the real world. You can choose from part-time courses, online courses, distance learning, certifications and diplomas. Today, many colleges offer undergraduate as well as postgraduate programs in cybersecurity that students can take full advantage of.
Is AI Coming For Your Job?
Built In, March 10
Since the beginning of large-scale industrialization, automation has led to massive, widespread job losses. With the fourth industrial revolution in full gear, the fear is palpable yet again. Artificial intelligence has the potential to automate even very complex tasks, to say nothing of the simple and repetitive ones. That labor-saving potential only sounds great before you think about the social consequences of unemployed workers and cities in economic decline. However, there is reason to believe that history will not repeat itself when it comes to the AI-fueled revolution.
AI is poised to fuel explosive growth. Just remember that the rise of the internet raised similar concerns about unemployment. Even though the internet became widespread only about 20 years ago, it has already created millions of jobs and comprises 10 percent of U.S. gross domestic product. Perhaps more importantly, the internet has not been responsible for the decline of cities the way automation in the car industry has. Currently, 63 percent of CEOs believe that AI will have an even larger impact than the internet, according to PwC. In its global AI study, PwC estimates that by 2030, global GDP will have increased by 26 percent due to AI alone. That is more than the current GDP of China and India combined. In addition, the World Economic Forum estimates that 97 million new jobs will be created through AI by 2025. It also estimates that some 85 million jobs will be lost in the same period, which still leaves a net surplus of 12 million jobs. That said, the distribution of these jobs might be unequal: in Sweden and the U.S., productivity is projected to increase by more than 35 percent by 2035 due to AI, while for France and Spain the gains might be less than 20 percent.
The SPACE of Developer Productivity
ACM Queue, March 6
Developer productivity is complex and nuanced, with important implications for software development teams. A clear understanding of how to define, measure, and predict developer productivity could give organizations, managers, and developers the ability to make higher-quality software. Developer productivity has been studied extensively. Unfortunately, after decades of research and practical development experience, knowing how to measure, or even define, developer productivity has remained elusive, while myths about the topic are common. Far too often, teams or managers attempt to measure developer productivity with simple metrics.
However, productivity cannot be reduced to a single dimension or metric. The prevalence of myths surrounding productivity, and the need to bust them, has motivated the development of a practical multidimensional framework: only by examining a constellation of metrics in tension with each other can we understand and influence developer productivity. This framework, called SPACE, captures the most important dimensions of developer productivity: satisfaction and well-being; performance; activity; communication and collaboration; and efficiency and flow. By recognizing and measuring productivity along more than a single dimension, organizations can better understand how people and teams work, and they can make better decisions. Used in practice, the framework can help organizations create better measures to inform their work and teams, and it may positively impact engineering outcomes and developer well-being.
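As a hypothetical illustration (the metric names and thresholds below are invented for this sketch, not taken from the SPACE framework itself), a team might record one metric per dimension and look for tensions between them, rather than ranking by any single number:

```python
from dataclasses import dataclass

# Hypothetical snapshot of one team's metrics, one per SPACE dimension.
# No single field tells the whole story; the point of the framework is
# to read the dimensions together, often in tension with each other.

@dataclass
class SpaceSnapshot:
    satisfaction_score: float       # S: well-being survey result, 1-5
    review_turnaround_hours: float  # P: a performance proxy
    prs_merged: int                 # A: an activity count
    cross_team_reviews: int         # C: communication and collaboration
    focus_hours_per_day: float      # E: efficiency and flow

    def tensions(self):
        """Flag dimensions pulling against each other, instead of
        collapsing everything into one productivity score."""
        flags = []
        if self.prs_merged > 40 and self.satisfaction_score < 3:
            flags.append("high activity but low satisfaction")
        if self.focus_hours_per_day < 2:
            flags.append("little uninterrupted time for flow")
        return flags

team = SpaceSnapshot(2.5, 6.0, 48, 12, 1.5)
print(team.tensions())
```

The thresholds here are arbitrary; the design point is that the record keeps all five dimensions visible at once, which is the framework's central recommendation.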
AI or Intelligence Augmentation for Education?
Blog@CACM, March 15
For more than 50 years, computer science practitioners have been outlining how newly emerging computing technologies could help people work together. The big question now is whether the best approach going forward involves artificial intelligence (AI) or intelligence augmentation (IA). In fields such as healthcare and education, IA may actually be a more useful, ethical and equitable approach. IA is best defined as an alternative conceptualization that focuses on the assistive role of AI, emphasizing a design approach and implementation that enhances human intelligence rather than replaces it.
In education, applications of artificial intelligence are now rapidly expanding. Innovators, for example, are developing intelligent tutoring systems for some subjects. AI applications also include automatically grading essays and homework, as well as early warning systems that alert administrators to potential drop-outs. There are also AI products for online science labs that give teachers and students feedback. Other products listen to classroom discussions and highlight features of classroom talk that a teacher might seek to improve, or assess the quality of teaching in videos of preschool classrooms. A recent expert report about AI and education uncovered visions for AI that would support teachers in orchestrating classroom activities, extend the range of student learning outcomes that can be measured, support learners with disabilities, and more. However, there are difficult problems related to privacy and security: society, for example, has an obligation to protect the data of children. And there are even more difficult issues of bias, fairness, transparency, and accountability.
Copyright 2021, ACM, Inc.