ACM CareerNews for Tuesday, December 1, 2020
ACM CareerNews is intended as an objective career news digest for busy IT professionals. Views expressed are not necessarily those of ACM. To send comments, please write to email@example.com
Volume 16, Issue 23, December 1, 2020
The use of artificial intelligence technology continues to grow within the enterprise, and that is opening up entirely new job opportunities for IT workers within the AI field. In fact, a recent Analytics Insight report projects more than 20 million available jobs in artificial intelligence by 2023. Due to the transformational reach of AI, specialists with the right skills could find themselves with job opportunities across a wide range of industries. A global skills gap in these technologies means qualified applicants can expect good salaries and a strong bargaining position.
In 2021, there will be growing opportunities for AI specialists in public safety, banking and financial services, and healthcare. These three industries have the money to invest in AI right now and stand to see the investment pay off most quickly. That said, the pandemic has caused the hardest-hit industries to step back and look at how they can leverage AI and machine learning to rebuild or adjust in the new normal. Data collection and preparation, as well as data analytics expertise, could end up being the most in-demand skills in artificial intelligence hiring. Organizations need to hire individuals who can identify the correct training data and annotate it accurately. They need talent that can maintain growing training sets and analyze the data to create targeted datasets for customized model generation. This means companies will require professionals familiar with algorithm tuning and training. Top candidates will also have DevOps experience, so they can set up datasets and continuous integration and continuous deployment pipelines that keep algorithms up-to-date.
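The dataset-maintenance work described above can be sketched in a few lines. This is a minimal illustration, not a real pipeline: the function names, the toy support-ticket data, and the deployment-gate rule are all hypothetical, and a production system would call an actual labeling tool and model framework.

```python
# Hedged sketch: two steps a CI/CD pipeline for model upkeep might run --
# merging newly annotated examples into the training set, and gating
# deployment of a retrained model on a simple no-regression check.
# All names and data here are illustrative assumptions.

def merge_annotations(training_set, new_batch):
    """Merge newly annotated examples, de-duplicating by input text."""
    seen = {example["text"] for example in training_set}
    merged = list(training_set)
    for example in new_batch:
        if example["text"] not in seen:
            merged.append(example)
            seen.add(example["text"])
    return merged

def should_deploy(new_accuracy, current_accuracy, min_gain=0.0):
    """Ship the retrained model only if held-out accuracy does not regress."""
    return new_accuracy >= current_accuracy + min_gain

training_set = [{"text": "refund please", "label": "billing"}]
new_batch = [
    {"text": "refund please", "label": "billing"},   # duplicate, dropped
    {"text": "app crashes on login", "label": "bug"},
]
merged = merge_annotations(training_set, new_batch)
print(len(merged))                # 2
print(should_deploy(0.91, 0.89))  # True
```

In practice the same two hooks (merge on every annotation batch, gate on every retrain) are what keeps a growing training set clean and an algorithm current, which is the skill set the article says employers are hiring for.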
In 2020, almost every software development company became a remote development company. Looking ahead to 2021, this trend appears likely to continue, and that could mean further changes and disruptions to the way teams work, and perhaps even the rise of new business models or software development approaches. Most developers adjusted well to the changes in 2020, certainly compared to other professions, even if it meant longer working hours, more remote work and greater overall uncertainty. In 2021, expect larger disruptions to the way development teams work, especially as the demand for new and modernized apps pushes business and development leaders outside their traditional comfort zones.
For 2021, Forrester Research predicts accelerated adoption of low-code platforms that will change how teams organize. During the pandemic, many organizations embraced low-code platforms to build and deploy new apps fast, and these experiences will drive most development firms to adopt low-code tools next year. Expect to see new hybrid teams emerge, with business users and professional developers building apps together with low-code tools built on cloud-native platforms. Long-term remote working will increase the importance of digital collaboration, even as agile scaling best practices emphasize the usefulness of cross-functional, co-located teams, since physical co-location enables high-bandwidth collaboration even with low-technology practices. Developers will need to make better use of collaborative work and value stream management tools, as well as new cloud-based team enablers like shared codespaces and pipelines.
Despite the coronavirus pandemic, the tech job market has remained remarkably resilient, especially in cities located far from either coast. In fact, a recent report from WalletHub Communications suggests the job market has recovered 46 percent from its low at the peak of the crisis earlier this year. Moreover, there are several states (such as Nebraska, Missouri and Kentucky) where job recovery is accelerating, and at least 10 cities where unemployment rates are well below the national average and tech employers are actively hiring.
Lincoln, Nebraska is a textbook example of a U.S. city that has held up well during the pandemic while still boasting notable new tech job openings. Although COVID-19 cases are climbing again with the cold weather, the city has managed to weather the crisis, and its current unemployment rate is just 3.36 percent. In addition, there are a few thriving industries where employers are actively looking for tech experts: Amazon Web Services, for example, is searching for a platform lead to support AWS environments. Springfield, Missouri is another U.S. city with top employment prospects. Engineers with a demonstrated history of leading and managing projects for big companies will find several openings for software systems analysts, data engineers, and other programmers looking to build an online brand. The unemployment rate there is only 4.2 percent, compared to nearly 8 percent nationally.
Is No-Code Going to Steal Your Job?
Built In, November 13
No-code and low-code platforms are gaining in popularity as a way to develop software, and that could have huge implications for software developers and programmers going forward. In some cases, low-code development can help organizations deliver software 10 times faster than traditional programming. In other cases, individuals with almost no coding background at all are able to develop a minimum viable product in just a few days. Meanwhile, Gartner predicts that 65 percent of app development will happen on low-code platforms by 2024, and technologists are starting to evangelize for their favorite no-code tools. As these platforms' reach grows, they could have a profound impact on the developer job market and even on how people define what it means to be a programmer.
On one hand, no-code and low-code platforms have the potential to boost the business value of programmers and non-programmers alike. Thanks to abstraction, non-technical employees can quickly create common types of applications and mold them to their immediate needs. Thanks to automation, developers save time on repetitive tasks like data entry or reporting. Of course, there are still plenty of ways for no-code and low-code to go awry. Both programmers and non-programmers can quickly lose track of the architecture of what they are building, which makes for poorly performing software. These systems work well at small scale, but piecing everything together and understanding how the components interact can become very difficult. For example, a business analyst could build an elaborate CRM system and find that there is zero visibility into how a customer moves through it, or a developer could discover too late that an app idea runs into platform-specific architectural limits. Those limits, however, are fading as automation advances.
Why Data Science Is Still a Top Job
Datanami, November 16
This year, for the first time since 2016, data scientist is not the number one job in America, according to the annual ranking from Glassdoor. While some of the early hype about Big Data may have faded, data science is still a great field to go into, with abundant opportunities, competitive pay, and job satisfaction that remains extremely high. With an average salary of $111,200 and a high average job satisfaction score, data scientist topped Glassdoor's annual list of the 50 best jobs in the United States four years running. In 2020, however, it dropped to third place, behind front-end engineer and Java developer. During the early stages of the pandemic, data science dropped in priority, but as the recovery from the economic lockdown gains momentum, we could see a sizable increase in data science jobs and more discussion about how to leverage data within the enterprise.
In the long run, it would probably be unwise to bet against data science as a career move, especially when you widen the field to include related positions like research engineers and machine learning engineers. The U.S. Bureau of Labor Statistics sees strong, albeit tempered, growth for data science jobs, predicting that the field will grow about 28 percent through 2026. No matter how you analyze it, data science is still a lucrative field with plenty of opportunity, which is why universities continue to add data science programs to their curricula. There are more than 830 separate data science programs offered by about 500 universities around the world. A Master's in Data Science is the most popular degree offered, and about 135 schools offer data science programs online.
Cybersecurity Careers: Which One Is Right For You?
We Live Security, November 13
The abundance of cyber threats and shortage of skilled professionals, as well as competitive salaries and interesting job descriptions, are some of the reasons why a career in cybersecurity remains an attractive option. However, choosing which career path to pursue may prove to be a daunting task, not least because there are so many careers to choose from, each with its specific requirements and skill sets. The good news is that not every cybersecurity career needs a university degree. If you are aspiring to join the swelling ranks of cybersecurity professionals, you will have to assess what skills you have and what skills you will need in order to apply for the position you want.
System administrator is one of the stepping-stone professions on the way to a cybersecurity career. Sysadmins need to have stellar knowledge of cybersecurity topics to perform their jobs properly. While a degree is not required, a Bachelor of Science (BSc) in network administration is recommended for the role; people who lack the degree but are interested in pursuing these careers can instead complete various certifications from reputable organizations. Sysadmins are indispensable to most companies, since they are responsible for the configuration, upkeep, operation, and security of computer systems and servers, as well as for troubleshooting problems and providing support to other employees. If you are seeking to become a system administrator, some of the top requirements are knowledge of Linux, major networking hardware, network engineering, and tech support. To transition successfully into cybersecurity, you would be well advised to add information security and systems, network security, and security operations to your arsenal of skills.
The Few, the Tired, the Open Source Coders
Wired.com, November 17
It may be time to re-think the open source coding movement, both in terms of how to compensate programmers and how to maximize the contributions of open source coders to projects. When the open source concept emerged in the 1990s, it was conceived as a bold new form of communal labor. And while open source has, overall, been a huge success, there have been some notable issues. For example, with the exception of some big projects (such as Linux), the labor involved is not particularly communal and the majority of the work lands on a tiny team of people. This can lead to open source burnout or even cases where coders walk away from projects entirely.
Experts are not quite sure what to do about open source burnout, but some think finding new sources of money for coders might help. For example, the team creating the open source language Rust is trying to set up a foundation to support core contributors and to encourage firms to keep contributors on staff. Some of the largest open source projects thrive in precisely this fashion: firms like Facebook or Google pay some employees to work full-time on open source code. Subscriptions could offer new ways to pay for the work. However, others worry that injecting pay can deform how and why the work is done in the first place. Either way, it may be time to rethink the very idea of what crowdsourcing is capable of and to recognize that it is perhaps more limited than promised.
Managers (Not Workers) Are Losing Jobs to AI and Robots
Forbes, November 15
Managers, not lower-level employees, are seeing their ranks diminished by the adoption of artificial intelligence and robots, according to a new study from the Wharton School at the University of Pennsylvania. In part, this is because as AI expands within a business, managers can oversee a wider breadth of operations, and that reduces the overall need for supervisors. Somewhat surprisingly, robot-adopting firms actually tend to employ more people, not fewer: any displacement of labor comes from firms that do not adopt robots, since these non-adopting firms lose their competitiveness and have to lay off workers. The most surprising finding from the study, however, was that AI and robotics adoption resulted in a reduction in management and supervisory ranks.
The decline in managerial opportunities in the enterprise is the result of vast efficiencies introduced to processes that once required a cadre of managers to oversee. There is less need for managers to ensure that workers show up on time or to inspect their work, since robots can record precisely the work that has been done. Another surprise is that AI and robots boost employment in both low-skilled and high-skilled jobs. Robots cannot directly substitute for low-skilled workers, and a single manager can potentially supervise many of these workers at a time. For high-skilled workers, the effect is a little less certain: they can manage themselves, and they often know how to do their jobs better than their managers. As a result, the career ladder is somewhat disrupted. For a low-skilled worker to become a high-skilled worker, they would need a college education and a lot more training, which many are unable to obtain for a variety of reasons. They risk being stuck in entry-level jobs with little or no opportunity to advance.
Guidelines For KFH (Koding From Home)
ACM Queue, November 18
Programmers working from home during the pandemic need to change how they work in order to adapt to the new challenges faced by businesses globally. In some cases, this can be as simple as learning how to host better Zoom meetings. In other cases, it involves adopting entirely new behaviors and habits in order to become more productive and efficient. For example, simply changing your routine in order to wake up at the same time each day can make a big difference, as it can help you keep a regular working schedule and maintain close contact with co-workers.
One good habit for programmers to embrace is to set a finishing time for each day and stick to it. Keeping a proper work-life balance is harder for someone used to going to an office who switches to working from home: suddenly there is no commute, and you can roll from bed to desk and back. Also, as a way to manage your time better and stay focused on the task at hand, take frequent breaks, at least 15 minutes per two-hour block, throughout the day. Staring at a screen without interruption for eight to ten hours a day is all too easy at home, where there are no coworkers to interrupt you.
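The two habits above, a hard finishing time plus roughly a 15-minute break per two-hour focus block, can be turned into a simple day plan. The sketch below is illustrative only; the function name and the 9-to-5 times are assumptions, and the block lengths are just the article's rule of thumb as defaults.

```python
# Hedged sketch: alternate focus blocks and breaks between a start time
# and a hard finishing time, following the article's rough guideline.
from datetime import datetime, timedelta

def day_schedule(start, finish, focus=timedelta(hours=2),
                 rest=timedelta(minutes=15)):
    """Return (kind, begin, end) tuples; the last work block is clipped
    so the day never runs past the finishing time."""
    blocks, t = [], start
    while t < finish:
        end = min(t + focus, finish)
        blocks.append(("work", t, end))
        t = end + rest
        if t < finish:
            blocks.append(("break", end, t))
    return blocks

for kind, begin, end in day_schedule(
    datetime(2020, 12, 1, 9, 0), datetime(2020, 12, 1, 17, 0)
):
    print(f"{kind:5} {begin:%H:%M}-{end:%H:%M}")  # first line: work  09:00-11:00
```

Clipping the final block to the finishing time is the point: the schedule always ends on time, which is exactly the habit the article recommends.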
Will Post COVID-19 Education Be Digital?
Ubiquity, October 2020
The world is experiencing large-scale social and behavioral changes in response to the COVID-19 pandemic. These changes have the potential to cause a fundamental and profound shift in the way we work, which could have both positive and negative consequences. The need to function, both socially and at work, while sheltering at home and social distancing has led to the widespread realization that online meetings and remote working are viable. Digital education is also particularly important, as millions of people worldwide struggle to continue studies or learn new skills in these difficult circumstances. In fact, the rise of digital education and online learning could be one of the primary consequences of the global pandemic.
Of course, digital education has been on the rise for some time. There were many small experiments with distance learning in the 1990s. The first commercial MOOC companies started up after 2009, and soon thereafter a few universities began to offer degrees composed of MOOC courses. The pandemic greatly accelerated the adoption of distance learning because many teachers and students were confined to home and could only meet in Zoom sessions. Teachers and institutions have done quite well ascending a steep learning curve. Education will permanently make much greater use of digital classes, which will be cheaper than in-residence courses. Combined with other serious financial disruptors, mainly the sharp drop in students attending campus, this is likely to cause reorganization and downsizing at universities.
Copyright 2020, ACM, Inc.