Effective Skills Validation Can Solve Critical, Real-World Problems

This article was authored by Cushing Anderson, Program Vice President, Consulting, System Integration, HR and Learning Services, IDC.

Technologies that enable digital transformation are now core to the business strategy and business success of most organizations. When an organization senses an opportunity, it may deploy a new approach or improve an existing one, using technology to streamline or automate processes, uncover new insights, or connect better with its employees, customers or supply chain. These innovations are always aimed at addressing a business need.

Digital transformation (DX) is ushering in a new era of digitally enabled experiences and creating rapid change and uncertainty. At the same time, traditional businesses and their IT organizations face new competition and disruption from digital-native enterprises and start-ups that don’t “play by the rules”: they evolve quickly, combine new capabilities and business models, and put existential pressure on established organizations.

These trends put developing and deploying technology at the center of business growth for all companies, new and old. Responding to them typically requires new operating practices, changes to organizational culture, an improved focus on customer intelligence and, often, new ways of thinking. Ultimately, it requires a team whose IT roles and skill sets are well aligned with the organization’s business and IT strategy.

People are the make-or-break element of a high-performing IT organization. There is simply no replacement for people with the right skills, attitudes and traits. CIOs don’t have the luxury of hiring a totally new roster of people with “the right stuff,” so they have to be creative in using the right mix of hiring, mentoring, training, contractors and partners to create a highly proficient team.

In fact, research on top-performing teams in cloud-related functions showed that more than 85% of their members were rated as highly proficient in the relevant domain. Lower-performing teams had fewer than 30% of members rated as highly proficient.

But hiring and promoting the right people often seems like a guessing game. How do you know you’ve found the right person to help pull off mission-critical assignments and programs? What does it even mean to be “highly proficient?”

Many options for assessing staff are insufficient. Even in robust learning management systems, it is often impossible to recognize when an employee has sufficient skills in an area and would be suitable to take on an important new role. There seems to be an assessment gap.

This “assessment gap” manifests itself in several ways:

Inadequate tools for selecting qualified candidates from applicants

According to The Wall Street Journal, the IT unemployment rate is estimated to be at a 20-year low. “The data confirms what employers have been saying for months and even years – the demand for tech talent has reached historic levels,” said Tim Herbert, executive vice president for research and market intelligence at CompTIA. The U.S. Bureau of Labor Statistics (BLS) suggests the unemployment rate for technology professionals was about 2.4% in February 2020.

Even so, there is no shortage of technology applicants. Despite a large number of unfilled positions, there were, on average, 43 applicants for every tech hire in 2018 (up from 36 in 2016), according to ZDNET. Given that recruiters spend only six seconds evaluating a candidate’s resume before deciding suitability for a role (according to the job board The Ladders), quickly recognizing whether an applicant is likely to have critical skills is key.


How can we confirm candidates have the skills they claim on their resume?

Non-existent processes for identifying qualified internal candidates

Hiring people with critical skills is highly competitive, so many organizations are looking to identify “talent reservoirs” within their own ranks. Unfortunately, according to Deloitte, 57% of employees think it is easier to find a good job at another company than to transfer to a new job within their own organization. On top of that, only 40% believe their company is “good” or “excellent” at enabling internal talent mobility. In fact, almost 50% of organizations don’t have a process for identifying internal talent that might be suitable for open positions.

Even so, employees are still motivated to pursue different careers within their company: they may want to reinvent themselves, but they want to do so with their current employer. And there is a significant benefit to organizations with agile career models. For example, Ingersoll Rand developed a “robust internal career program” to help employees reskill themselves for alternative roles and careers across the company; Deloitte calculated that this increased employee engagement at Ingersoll Rand by 30%.

How can we confidently identify motivated and capable employees who have the skills we need from INSIDE an organization?

Guessing how to staff critical IT projects

Several sources suggest that a critical component of project success is having team members with the right skills to contribute. IT leaders and project managers must know and define the skills and qualities that will drive the success of the project. This includes both interpersonal skills, such as dependability, communication and creativity, and the specific technical skills the current project requires. In some organizations, the approach is simply to ask the employee “How good are you at X?”, perhaps confirming this self-assessment with a supervisor or others with whom they have worked, but this approach is unlikely to reveal real skill.

Our research confirms that skills matter: IDC’s Cloud-Based Enterprise Application Performance Survey of more than 1,000 IT leaders worldwide found that well-trained cloud migration teams meet nearly 90% of their business and project milestones, compared with less than 50% for teams with only “average” skill levels.

Eighty percent of the organizations with sufficient skills in automation and orchestration tools report being satisfied or very satisfied with the business impact of their move to cloud. Only 20% of the organizations that lack sufficient skills report being satisfied with the impact cloud has had on their business and digital transformation.

How can we reliably identify team members with the skills we need for a critical project?

Unnecessary retraining, undertraining or no training at all

It is fair to assume that in every organization, IT professionals are busy and don’t have time to take a course that teaches them something they already know or, worse, something they don’t need to know. Similarly, with the technology environment continually changing (it is not uncommon for significant changes in cloud-based solutions to occur every month), IT professionals need to acquire new knowledge continuously. And this doesn’t account for the knowledge needed for new technology that has been deployed within an organization or is built into its IT strategy. IDC research has found that without a meaningful commitment to ongoing training and development, an IT team can lose as much as 45% of its capability in as little as three years.

How can we identify the most appropriate development opportunities for our IT professionals?

Common to each of these problems are two difficult-to-answer questions: What can a given IT professional actually do? And its complement: What does that IT professional need to learn? Increased use of well-designed assessments can provide a starting point for answering those questions. If those assessments include performance-based elements, we can get even closer to understanding the IT professional’s true skills and abilities.

IDC has built a maturity model that lays out the essential building blocks of a high-performing IT skill development program. Core to high performance is a deep reliance on skill assessment to help team leaders and IT leadership improve career development, match IT professionals to roles, and align skill development initiatives with strategic IT programs.

Performance-based assessments replace traditional written tests with real-world tasks performed in live environments. This is crucial to ensuring that technical professionals’ training translates directly into the ability to perform tasks and solve real problems. Traditional learning assessments (questions and answers, most often written) can reliably discover what a person knows. Performance-based assessments, which present a meaningful, complex problem in a live or simulated environment, can uncover how well IT professionals can apply what they know. (Research in adult learning theory suggests that challenge-based learning is also a highly effective approach to knowledge acquisition, not simply a method of validating skills, but that will need to be the subject of another blog post.)

Performance-based assessments can range in complexity from simple exercises to complex scenarios in which the IT professional develops or implements a complete solution to a problem. A well-designed performance-based assessment measures a higher level of thinking: it requires combining knowledge and skill across the domain to create or do something that solves the problem and, most important, it closely mirrors real-world situations.
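
To make the idea concrete, here is a minimal, hypothetical sketch of how a performance-based lab task might be scored automatically, by checking outcomes in the candidate’s environment rather than answers to questions. The file names, ports and checks below are illustrative assumptions, not a description of any particular assessment platform or of IDC’s methodology.

    # Hypothetical auto-grader for a performance-based lab task (Python).
    # It scores outcomes (files produced, services running), not written answers.
    import json
    import socket
    from pathlib import Path

    def check_config_file(path, required_keys):
        """Pass if the candidate wrote a JSON config containing the required keys."""
        try:
            config = json.loads(Path(path).read_text())
        except (OSError, json.JSONDecodeError):
            return False
        return set(required_keys).issubset(config)

    def check_service_listening(host, port, timeout=2.0):
        """Pass if the service the candidate was asked to deploy accepts connections."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def score_candidate(workdir):
        """Run the outcome checks for one candidate's lab environment."""
        checks = {
            "config_written": check_config_file(
                f"{workdir}/app_config.json", {"db_url", "cache_ttl"}
            ),
            "service_running": check_service_listening("127.0.0.1", 8080),
        }
        return {"checks": checks, "score": sum(checks.values()) / len(checks)}

    if __name__ == "__main__":
        # Example: score a single candidate's sandbox (paths and ports are made up).
        print(score_candidate("/tmp/candidate-001"))

A richer rubric or a human reviewer would sit on top of checks like these; the point is simply that the score reflects what the candidate actually did in the environment, not what they could recite about it.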

So far, the only barrier to increased use of performance-based assessments is availability: Are there enough appropriate challenges for the range of roles and skills in an organization to warrant a leap into performance-based assessments? These assessments are complicated to build and, like the technology skills they evaluate, are continually changing. However, a robust and evolving library of assessments, combined with the willingness to act on the insight that reliable information affords, can help organizations solve many people-related problems that are critical to business success.

The value propositions for increasing the use of assessments, particularly performance-based assessments, seem clear:

  • Overall enterprise: Validating applicant skills through performance-based assessments increases “quality hire” rates, reduces hiring costs and ultimately improves IT organizational performance. Enterprises of all sizes should consider the impact of more thoroughly integrating performance-based assessments into new-hire selection and internal transfer processes.
  • IT managers and supervisors: High competence improves the on-the-job performance, career success and advancement of IT professionals. IT supervisors should leverage performance-based assessments to understand the true competence of IT professionals and guide them toward development or other opportunities that support both the organization’s IT strategy and the IT professional’s career aspirations.
  • Project leaders and project sponsors: Leveraging performance-based assessments can help identify team members (and external consultants or contract hires) with the skills necessary to ensure more successful projects and increase the quality of technical solutions.

About the author

Cushing Anderson manages the research agenda, field research and custom research projects for IDC’s IT Education and Certification research program. Mr. Anderson’s research coverage ranges from the value certification provides to IT professionals to the criteria organizations use when selecting transformation training for the IT organization. He conducts regular research on the views and experiences of IT professionals and IT education buyers, and he frequently evaluates the impact of various types of training and certification on IT organizational performance.

Background

Mr. Anderson was for many years responsible for IDC’s Worldwide Business Consulting research, evaluating the capabilities and impact of large multinational consulting firms. He was also a research director in IDC’s Software Business Strategies group, where he managed the Partnering, Alliances and Licensing research services and IDC’s Software Alliance Executive Council. Prior to joining IDC, Mr. Anderson worked as a consultant for the Center for Educational Leadership and Technology (CELT Corp.) and as a technology and management consultant with Coopers & Lybrand LLP, providing technology analysis, planning and benchmarking services to businesses in a variety of industries. He also served as a Lieutenant in the U.S. Navy, where he coordinated ocean-going transportation during the first Gulf War.

Education and Industry Accomplishments

  • B.A. in Government from Connecticut College
  • M.Ed. in Curriculum and Instruction from Boston College’s Graduate School of Education
  • Board of Directors, Center for Talent Reporting
  • Former Editorial Advisory Board and columnist for Chief Learning Officer magazine
  • IDC Peacock Award winner for Professional Excellence, 2013
  • IDC GRAC Analyst of the Year, 2008

Want to validate users' skills?

We can work with your team to add validated skills development to existing
hands-on labs OR start from scratch. Click the button below to request a call.