
Making Assessments Work for Your Staffing Business

TRANSCRIPT

Jonathan Covey: All right. Wonderful. We should live here. So, thank you everyone for joining this session. I know you had some options, so we're glad to have you here, and we're not going to disappoint. My name is Jonathan Covey. I'm a senior analyst at Talent Tech Labs, and I have the pleasure of moderating this discussion on assessments.

Jonathan Covey: Really, the purpose here is to talk about practical ways that staffing executives can use assessments to drive business results and just be more effective in the market. We have a number of very smart folks on the panel here, representative of both some of the leading tech providers in the space and the client side.

Jonathan Covey: So I'll go through the names here. We have Satish Kumar, the CEO of glider.ai. Glider.ai is a skill assessment platform specifically focused on assessing technical and coding talent. We have Dan Sines, co-founder of Traitify. Traitify is a behavioral assessment and was actually recently acquired by Paradox, one of the leading conversational bot providers.

Jonathan Covey: So, interesting dynamic. We can hopefully talk about that there. We have Omer Molad, the co-founder and CEO of Vervoe. It's bright and early for him in Australia, so we hope he has coffee in hand. Vervoe is a skill assessment platform with a lot of capabilities around customizing assessments for different use cases.

Jonathan Covey: And then it's great to have Pat Rush, the senior director at Kforce, a large staffing company that I'm sure you've heard of. But before we jump into assessments, let's go around and hear from each of the panelists a bit on their background. And let's start with Satish.

Satish Kumar: Sure. Hi everyone.

Satish Kumar: I'm Satish Kumar, co-founder and CEO of glider.ai. We have been providing talent quality evaluation and fraud prevention solutions to enterprises, MSPs, and staffing firms. We are on a mission to make hiring fair and opportunity accessible. We have been working with a large set of staffing firms to bring quality to the forefront of the contingent market.

Satish Kumar: In fact, more than 60% of the world's top staffing firms rely on Glider to vet candidates with confidence and to stand out from the crowd. Right? So we fundamentally believe in competency over credential. That means anybody who's capable should get an opportunity. That is what we are after. Right. And contingent leaders and research organizations agree with us as well.

Satish Kumar: No wonder last year we won the SIA Shark Tank award for the most innovative solution in the market. So in summary, we have built a solution for talent quality that empowers and protects the enterprise and the staffing agency. I would love to discuss more about the assessment and evaluation solution.

Jonathan Covey: Thank you. Dan Sines?

Dan Sines: Hi everyone. My name's Dan Sines. I was the co-founder at Traitify. Traitify was acquired last fall by Paradox, a leading provider of conversational AI and really a company that's kind of creating the future of hiring. I'm now a VP of product there at Paradox and really excited about what we're doing with the original Traitify platform,

Dan Sines: and now combining that with the conversational AI technology. What Traitify developed was an image-based assessment process, really designed for high-volume hiring in particular, where we were able to deliver a personality assessment in 90 seconds. So a much faster, more fun and engaging way to measure personality in candidates that really worked well for that high-volume sector.

Dan Sines: Now we're able to kind of apply that within the chat as part of a more engaging continued candidate experience. My background is in design, and so I'm very focused on that candidate side of the equation and how we can design better experiences for the people that are going through the hiring process.

Jonathan Covey: Thank you. Omer?

Omer Molad: Hey, good morning. Good afternoon, everyone. So at Vervoe, our mission is to make hiring about merit. We focus on putting candidates in the scenarios they would normally face in the jobs they're trying to get, as close as possible to simulating the role, and we test job-specific skills and general work skills.

Omer Molad: And then we automatically grade the responses and rank candidates based on how well they perform. And we learn from each customer. So we continue to get better and better over time using our machine learning models. And we work with both corporations and agencies and staffing firms. Thanks.

Jonathan Covey: Excellent. And last but not least, Pat?

Pat Rush: Hi everybody. I'm Pat Rush. I'm a senior director of delivery transformation with Kforce. I've been in technical recruiting for 10 years and have held recruiting roles and recruiting leadership roles. And now my focus for the firm is all around recruiting strategy and innovation.

Pat Rush: Part of my scope within Kforce is around assessments, how we utilize assessments, you know, within our recruiting front and with our clients. You know, looking forward to being here and having this conversation today.

Jonathan Covey: Thank you. All right. Cool. Let's jump in. So, assessments. And really screening in general.

Jonathan Covey: It's a broad category. There's a ton of tools out there. And just within assessments, you have behavioral, skill, gamified, all these different tools. And one of the things we spoke about last week is how you use these tools. It's much different on the contingent side and in a staffing environment than for corporate full-time workers.

Jonathan Covey: And so that's another layer of complexity. I want to start with why even use assessments at all in agencies, where in the funnel they should be used, and who really owns the quality of the worker. And so let's start with that question, which is: why should staffing companies even consider the use of assessments?

Jonathan Covey: Feel free to jump in with anyone. 

Satish Kumar: Sure, I can take that. Awesome question. Right. So the goal of assessment in the funnel is to maximize the ROI for the staffing agencies. Now, the staffing firms can use assessment just for their internal purposes with the candidate, to find the pool of qualified candidates to present to the client.

Satish Kumar: Or you can actually take it a step further and actually present the report to the client as well. Now, whichever way you decide to use it, whether it's just internal or to present to the client, the place in the funnel where you should use an assessment is only after you engage with the candidate.

Satish Kumar: You have talked to the candidate, set the right expectations about the process, and only then send the invitation for the assessment. What I've seen fail miserably is: hey, somebody applied, and hey, take an assessment. Right? That kind of connection has to be there for you to expect a candidate to commit time to you.

Satish Kumar: And it also shows that you value the candidate and are equally vested in bringing the opportunity to the candidate. So that's where you would place the assessment in the funnel.

Jonathan Covey: Thanks. Omer, any thoughts to share here on why even use assessments in staffing?

Omer Molad: I think that, in addition to all the obvious reasons, the difference in the use case with staffing firms is you're presenting someone to a client. And so speed is important. And obviously the way that you present someone is important, particularly if it's contingent, because you're trying to win.

Omer Molad: So you want to get your candidates placed over someone else's candidates, and you want to do that quickly and, you know, in a way that is convincing. And so an assessment, and particularly the output from that assessment, gives you a format to present the candidate, versus a resume or something more static like that.

Omer Molad: And that's a key difference between the staffing firm use case and a corporation, where you're making your own decision, as opposed to presenting that information to someone else to make a decision. I think that's interesting, and you know, Pat's the best person to talk about that, obviously.

Jonathan Covey: Yeah. And we'll jump to him in a second. Dan, any thoughts to share sticking on the provider side here? 

Dan Sines: No, I mean, I think I generally agree that we use these assessments to give you more bias-free information on that candidate. So it's an objective measure across all the candidates you're looking at, which can make it a lot easier to present to a client in a unified way.

Dan Sines: So I think that's a key part of why you use assessments. I do think on where they fit in the funnel, I maybe have a different perspective from Satish, because I think it depends on what assessment it is. So, you know, in our case, we're looking at personality and we have a very fast, engaging kind of assessment format.

Dan Sines: That's quick. We want to do it higher in the funnel if possible, so you can kind of expand the reach of that and maybe reapply that candidate to a different role. But I completely agree with Satish's premise that you don't want to put someone through something before there's a commitment level of some sort.

Dan Sines: You want to be really respectful of that candidate's time, especially in the current environment; we have a candidate shortage out there. So you have to really think through what's the journey that candidate is going through with this role, where is it going to best fit, and do you need that information at this time?

Dan Sines: Or could you get that information at a later time? 

Jonathan Covey: Yeah, that's a good point. All different types of use cases here, and what you're measuring is different. So at what point in the funnel is it appropriate to capture? So Pat, tell us about some of your goals as an organization in using assessments and what that looks like at Kforce.

Pat Rush: Yeah. I think all the presenters are spot on here and I agree with them in different aspects. I think the way we look at it is, you know, there are different variables at play or different use cases for different types of scenarios. And I can give a couple of different examples. You know, it may be that the client is engaged and they want to use some type of assessment as a part of their evaluation process for the candidate.

Pat Rush: You know, they may help determine at what phase in the process they want to implement the assessment. It could be, you know, before they actually see the resume of that candidate, because they want to see the resume along with the assessment. It helps back up the skills the candidate has listed on their resume and can help facilitate the interview process.

Pat Rush: Sometimes we have clients that want to use an assessment as, you know, a second-round interview, like an onsite, different things like that. So, you know, with the flexibility of these assessments, we can kind of work with the client on what they require for their interview process, and the goal is to present the best candidates for the right opportunity while improving the speed of the interview cycle for the clients. So it's, you know, a benefit for the candidates and the clients on that front. A lot of staffing firms like us also have practice-based work, where we're building actual project-based teams that we use assessments for as we're building out these project teams.

Pat Rush: So that's another variable that could change at what point we're using different types of assessments. As we're building the project team, we could use it top of funnel to gain interest in the project, you know, doing initial screening of candidates that just are not the right technical fit or whatever it might be for that project-based work.

Pat Rush: I also think, you know, this use case is often overlooked by a lot of teams, or just recruiters in general: the use case of interview preparation for candidates, prepping for these technical interviews. It doesn't necessarily always have to be about screening in or screening out candidates.

Pat Rush: Oftentimes folks haven't interviewed for a new job in a long time. You know, we do a lot of technical recruiting, so we're using a lot of the coding exercises and algorithm-based questions, and candidates aren't necessarily engineers who are writing algorithms every single day on the job.

Pat Rush: And they need to practice for that if they're going to be challenged on it in their interviews. So it's just a good opportunity for candidates to, you know, kind of just prepare for these interviews that they're going through. And that's something that, you know, we help our candidates with as well.

Pat Rush: So I think there are different variables you can look at and different places that you can plug these assessments in, but it's good to have the flexibility that a lot of these tools offer for us as a staffing firm, to be able to do that.

Jonathan Covey: Great. Great. And I think you hit on an interesting point there, which is that assessments can be a tool to develop candidates.

Jonathan Covey: And it's being used certainly on the corporate side in talent management, and there are mobility implications around it. I want to go a little deeper into the candidate side. And so Satish, I'll go to you, because when we spoke as a group, we were talking about this perception by candidates:

Jonathan Covey: Oh, this is another thing I have to do in the recruitment process, another barrier to entry, if you will. Like, you know, I just want to get the opportunity. But you have an interesting point, which is, this is a way to prove your skills and really move faster through the recruitment process. So, in this candidate-constrained environment:

Jonathan Covey: How do you position an assessment to be really a benefit to the candidate? 

Satish Kumar: Sure. So definitely assessments have benefits for the candidates. I mean, an automated assessment or a screening that is well calibrated presents unbiased data about the candidate's competency, right, to the hiring manager, to add objectivity.

Satish Kumar: And that ultimately helps candidates as well. It helps them showcase their strengths. It gets them to jump ahead in the hiring process in some cases, because that is how the client has designed their own hiring process. But more importantly, once they get tested, staffing firms can actually present multiple opportunities to them, as against getting interviewed by every client.

Satish Kumar: Right. But I think another point that Pat also pointed out, which is very relevant here, is that through the assessment process, hiring companies can actually find upskilling opportunities by identifying the weak areas through the assessment, and they can still go ahead and hire that candidate. Right. And this is especially relevant in a hot market.

Satish Kumar: And if you are really after veterans or people from economically poor backgrounds, it makes so much sense to identify the strength areas and hire based on that, given the upskilling opportunity coming down the line. So the point I want to make is that through the assessment, right, the capability part is evaluated.

Satish Kumar: Now as a candidate, I can spend quality time in the live interview with humans to find the fitment with the team, with the culture. Right? Those are the things that are proven to drive higher job satisfaction. It's a win-win situation for the candidate as well: the things that can be automated for capability evaluation, let's get them done through the assessment.

Satish Kumar: And let me spend time with the hiring manager, figuring out whether I am a fit for this organization, for this group, for the culture or not. So there are so many benefits to the candidate, and it's really about the messaging from the staffing firms. Right? As you said earlier, it has to be presented as a benefit, but on top of that, if you have a high quality benchmark and you share that with the candidate, it makes the position look more aspirational as well.

Satish Kumar: Good people get attracted to the better quality jobs, right? So I think those are some of the points that can be used by the staffing firms to communicate and message assessments.

Jonathan Covey: Thanks. Omer, any thoughts on positioning assessments to candidates? Best practices you're seeing?

Omer Molad: I mean, I think in short it's an opportunity to succeed, right.

Omer Molad: Not an opportunity to fail. That's the key. So, I mean, even the word assessment is not a great word, right? It sounds like a root canal. So what it is, is an opportunity to showcase your talent, okay, and be judged based on what you can do and your ability to do the job. Now, that sounds like something I want to do.

Omer Molad: Assessment might sound less like something I want to do as a candidate. Right? So the key is, it's not about examining you. It's about giving you an opportunity to show us what you can do and also get a realistic job preview and see what the role is like. And it's an opt-in. That's the key.

Jonathan Covey: Okay, great. And Dan, interested in what you have to say here, because there's a little bit of a different flavor with Traitify.

Jonathan Covey: You're kind of exposing to folks their personality, like a psychological profile, and in some sense helping promote self-awareness, and that can be a good tool for candidates. Any thoughts there?

Dan Sines: Yeah, absolutely. I mean, first I agree with both the guys' previous comments here on this, but in addition, you know, I think that the real benefit to the candidate is giving them feedback.

Dan Sines: You know, often you just don't get any feedback when you're going through the hiring process. And so Jonathan, you mentioned the use of assessments in talent management for learning. You can learn as a candidate. And I think that's a really great opportunity to let them learn more about themselves.

Dan Sines: Learn more about how to interview properly, just get general feedback that can help them advance in this job or the next thing they apply for, so that they feel like the experience was worth it when they left. And that continues to create a good relationship between you and that person if there's a future role where there might be a fit for them.

Jonathan Covey: Great. Pat, I'll pose a different question to you. You know, one of the things folks listening to this discussion are thinking about is, there are so many assessments out there, particularly in the skill space, because they're so occupation-specific. You've got ones for programming, like the Gliders of the world.

Jonathan Covey: Retail and different industries, hourly versus shift workers. At Kforce, how did you guys navigate picking the right assessments for different segments? How can folks tackle that challenge?

Pat Rush: Yes, that's a really good question, and there are a lot of different tools and potential solutions out there.

Pat Rush: I think at the fundamental level, you have to look at what is the true problem that you're trying to solve. What are your use cases that you then need to potentially attach a tool to, or find a solution for? And that's the approach that we took, you know, what are clients looking for? What are they asking for?

Pat Rush: What are our candidates looking for? What do they need to be successful? And that's where we started when we were evaluating the different assessment tools that we use. That's kind of the approach that we took when we were meeting with folks like Satish and Omer about their tools. And I think you can either look for a very narrow solution to a problem, or there are tools that are more broad that can solve more broad problems or use cases that you may have in front of you.

Pat Rush: And one tool may not be the best solution. You know, you may need to use multiple tools, and that's something that, you know, as a firm we do; we have a few different types of assessment tools for different use cases that work best for us. I think another important thing to look at as well, when you're evaluating different tools and vendors, once you kind of figure out what the problem is and how you think you need to solve it with whatever tool you're going with,

Pat Rush: is the company itself. Like, what is their journey? What is their product roadmap? Are they going down the product maturity path that you need to see for your staffing business? Are they going to be able to evolve with you as the market, as dynamics, as things are changing

Pat Rush: within the talent landscape? And we all know things change so quickly. So that's something we looked at when we were evaluating tools as well: where are they at in their product maturity and where are they planning to go? And does this fit with where we see the market going as well? So, you know, there are a lot of different things to look at, but I think again, the key is what's the problem you're trying to solve.

Pat Rush: What are the use cases for those problems? And then go to the vendor partners with these tools and figure out which one is the best fit for what you're trying to solve.

Jonathan Covey: Cool. Yeah. And Satish, one of the things you called out was there's no silver bullet; there's a need for different tools for different use cases.

Jonathan Covey: You know, you might use a Paradox for screening on the front end and then a Glider for a different use case. Any thoughts to share with the participants on just how to navigate what is an overwhelming ecosystem of solutions?

Satish Kumar: Yeah. Yeah, definitely. So, if you look at the hiring funnel, the purpose of evaluation could be different at different points, right?

Satish Kumar: At the top of the funnel, in a way you want to engage with the candidate, but you also want to pre-screen a little bit, you know, are they really the fit to take them further. So the top-of-the-funnel screening bots are a good way to really engage with the candidate at scale and get them pre-filtered, right, for your role.

Satish Kumar: Then, if you go a little bit further down, it could be an assessment product, right, where you are trying to understand much, much deeper, by simulating the kind of job in a more interactive and engaging way, to get much deeper insight. And the third step could even be a human conducting a live interview,

Satish Kumar: where they are using tools for the candidate to do the hands-on task, but they're present with the candidate as well. So if we look at it, depending on the client, some of them use all three in the process. Or if, for example, it's for non-tech functional roles, maybe the screening bot is good enough, right?

Satish Kumar: Because they can really figure out the right traits to assess the candidates. Typically for the technology roles, they do want to go deeper in understanding, so they can skip the screening bot but really want to get into a deeper evaluation through the assessment or even through the live interview. So it depends on the role, and on the seniority as well, because, you know, for senior candidates they would rather spend human time working in the tool and assessing the candidate.

Satish Kumar: So different dynamics play a role. Depending on the industry, the type of evaluation, and the seniority level of the candidate, you can pick one or more methodologies.

Jonathan Covey: Yeah. And then there's this other layer, which is that companies want to assess for the way that they do work.

Jonathan Covey: There's a cultural element. And then of course they're hiring for different personas, like you called out. And so it's like, how do you balance the assessment's ability to customize versus what's out of the box and a sort of library of job families and roles? Maybe Omer, I'll jump to you, because Vervoe is one of those systems where you can sort of create your own as well as use out-of-the-box job roles. Any thoughts on the balance of customization versus what you can get out of the box from a solution?

Omer Molad: So, first I just want to come back to the comment Pat made about understanding the problem you're solving, and that may sound obvious.

Omer Molad: I think we all know that's not always the case, and I think that's really important. Okay. If you have clarity about what you're trying to solve, you're more likely to actually get the solution that you need. And Jonathan, to your point, we'll always lean to the side of tailored and in context.

Omer Molad: And so, if you talk about resilience, a sales development rep gets told no a lot; they need resilience. A 911 emergency operator also needs resilience. But resilience clearly looks different in different work. So it doesn't make sense to test resilience generically. You want to test resilience in the context of the role. Same with attention to detail: a librarian versus a CFO.

Omer Molad: So we would say make the assessment appropriate for the role, specific and in context; get as close as possible to the role itself. What does that role look like in the company that is hiring, and what does success look like? A graphic designer is not the same job in every company, despite having the same title.

Omer Molad: Therefore, the assessment shouldn't be the same.

Jonathan Covey: Thanks. Great. Let's shift gears. You know, it's fun to talk about technology and all the stuff that's out there, but business results are really important. So, on ROI, maybe Pat, I'll jump to you. Like, how do you know that the assessment program and strategy is working?

Pat Rush: Yeah, I think that's a really important question and a good question. And it's an area that I spend a lot of time in, just as we're, you know, looking at improving and evolving our overall recruiting strategy. I mean, data is king, and luckily with a lot of these more modern assessment tools, we get a lot of data from them.

Pat Rush: Which is great. Not only the data from the assessments, you know, how candidates are scoring against your candidates versus, you know, all the candidates who take these assessments, but then you can use your own data alongside it. So things that we look at are overall conversion metrics. We want to see an improvement in

Pat Rush: our candidate funnel, from the top of the funnel all the way through the offer stage. So we look at things like the speed or conversion metrics from, like, an actual candidate to an interview, from an interview to an offer, time to interview and offer, different things like that.

Pat Rush: Like, are we improving the speed at which we can deliver quality candidates to our customers? We also measure engagement rate at the assessment level. What is the dropout rate? When we send a candidate an assessment, what percent of candidates actually complete that assessment, or complete it within, you know, a given time? That's also an important metric to measure, because if you're losing candidates in that process, then maybe there are tweaks you need to make to the assessment to make it more, you know, consumable for the candidate.

Pat Rush: If it's taking too much time for candidates to complete, they're less likely to take the time to complete it. Different things like that. You know, we look at our client satisfaction scores as well, and we do different NPS surveys, both on the client and candidate front, and look for improvement there.

Pat Rush: And then ideally, at the end of the day, you know, we're looking at revenue and, you know, higher pay rates for candidates and higher bill rates as well. So there are, you know, a multitude of different things that you can measure, but I can tell you, from the maturity we've gotten to with a number of our assessment tools, we have seen significant improvements, especially with submission to interview and submittal to offer,

Pat Rush: or submittal to hire. And then just the time that it takes to get candidates through the actual process is improving drastically as well.
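For reference, the funnel metrics Pat lists can be computed from simple stage counts. Here is a minimal Python sketch using hypothetical numbers, not Kforce data; the stage names are placeholders.

```python
# Illustrative funnel metrics with hypothetical stage counts.
funnel = {
    "submitted": 400,    # candidates submitted to clients
    "assessed": 320,     # candidates sent an assessment
    "completed": 288,    # candidates who finished the assessment
    "interviewed": 150,  # candidates who reached a client interview
    "offered": 90,       # candidates who received an offer
}

def rate(numerator_stage: str, denominator_stage: str) -> float:
    """Conversion rate between two funnel stages, as a percentage."""
    return 100.0 * funnel[numerator_stage] / funnel[denominator_stage]

print(f"Assessment completion rate: {rate('completed', 'assessed'):.1f}%")
print(f"Submittal to interview:     {rate('interviewed', 'submitted'):.1f}%")
print(f"Interview to offer:         {rate('offered', 'interviewed'):.1f}%")
```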

Jonathan Covey: Got it. Okay. So there are a lot of metrics you're looking at there, and just a quick follow-up. It sounds like you're doing a look back and looking at client NPS scores. Are you also capturing how well the candidate is doing in the role, like performance?

Jonathan Covey: And then does that help inform what good looks like and how you go about using the assessment? 

Pat Rush: Yeah. Yeah, exactly. We look at all of that data and correlate it back to the assessment. We obviously don't use an assessment for every single role and every single position, you know, but based off how we have different items integrated into our systems, we know if there was an assessment applied to a certain job or a certain candidate and things like that.

Pat Rush: So we can kind of measure it through, you know, the entire project life cycle that the candidate was on assignment with our client, and the NPS scores that we get based off of that.

Jonathan Covey: Okay. Dan, thoughts on ROI, things to measure?

Dan Sines: Yeah, I mean, I think that covered most of the important ones in there.

Dan Sines: I think, you know, for us completion rate is a big thing that we look at. We want to make sure that we're maintaining that; we have a 95% completion rate across around 17 million assessments now. So we're always looking at and staying focused on that. I think time to fill is often something that's looked at as well,

Dan Sines: as a key metric. And then quality of hire, which could mean a lot of different things to a lot of different groups. That could be turnover reduction, performance, and increased retention. There are a lot of different ways it could be looked at, but it's probably the most meaningful ROI impact you can generate.

Dan Sines: And the more that we can take assessment data, pair it with other data, and create a feedback loop out of that, the smarter and smarter we can get with that information in the future to improve that ROI.

Jonathan Covey: Great. And Omer, you're doing some interesting things at Vervoe collecting a bunch of data sets.

Jonathan Covey: Could you talk to us about that? 

Omer Molad: Yeah. So we're closing the loop, which from our perspective is the most important thing, in addition to what both Pat and Dan said, all those metrics that are important. We're now surveying the hiring manager 120 days after the decision was made, and we're asking a number of questions, and really the goal is to understand: was that hire successful, are they performing in the role?

Omer Molad: And then what we're trying to do is understand or correlate the post-hire performance to the performance in the assessment, and then look at which skills mattered the most. And so the goal eventually is to get to a point where we can say, okay, of the set of all the sales reps that you hired,

Omer Molad: these are the ones who ramped quickest and got to quota quickest. How did they perform in the assessment? Well, they're really good at these skills. And so that means that these skills correlate to performance. So let's over-index on that, test more for that, and not in a generic sense, but, you know, in a role-specific sense and sometimes a company-specific sense.

Omer Molad: And so it helps inform the assessment, make the assessment better. It's a never-ending feedback loop. And so that's really part of our philosophy. You know, we don't know what good looks like; we know how to test. The client, the organization hiring people, knows what good looks like. And so we want to learn.

Omer Molad: We want to learn from them and use that to inform the way that we're evaluating candidates. 
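For reference, the loop-closing Omer describes can be illustrated with a simple correlation between per-skill assessment scores and a post-hire manager rating. This is a minimal Python sketch with fabricated skill names and numbers, not Vervoe's actual model or data.

```python
# Toy example of "closing the loop": correlate per-skill assessment scores
# with a 120-day post-hire performance rating. All numbers are fabricated.
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation coefficient for two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each row: per-skill assessment scores plus a hiring-manager rating (1-5).
hires = [
    {"objection_handling": 82, "prospecting": 65, "crm_hygiene": 70, "rating": 4.5},
    {"objection_handling": 58, "prospecting": 72, "crm_hygiene": 90, "rating": 3.0},
    {"objection_handling": 91, "prospecting": 60, "crm_hygiene": 75, "rating": 4.8},
    {"objection_handling": 49, "prospecting": 80, "crm_hygiene": 85, "rating": 2.7},
    {"objection_handling": 75, "prospecting": 68, "crm_hygiene": 60, "rating": 4.0},
]

ratings = [h["rating"] for h in hires]
skills = [k for k in hires[0] if k != "rating"]
# Rank skills by how strongly their scores track post-hire performance.
for skill in sorted(skills, key=lambda s: -pearson([h[s] for h in hires], ratings)):
    scores = [h[skill] for h in hires]
    print(f"{skill:20s} correlation with performance: {pearson(scores, ratings):+.2f}")
```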

Jonathan Covey: Yeah, super important. That closing of the loop is super important. With traditional providers like an Aon or a Willis Towers Watson, it's traditionally been pretty consulting-heavy. But now with APIs and integrations, you can start to connect systems and run these sorts of algorithms to understand what good looks like, to your point.

Jonathan Covey: Satish, what are you guys doing on that front? How do you think about that?

Satish Kumar: Well, definitely. So on top of what you just talked about, right, one of the things that we also track is candidate experience, and that is collected through a candidate satisfaction survey at the end of the test. That also gives us an understanding of how the assessment is being received as well.

Satish Kumar: But going back to this, connecting the feedback is very important, because I think Omer said it correctly that what is good is really defined by the client in a way. And we all know that the calibrations are different for different clients, right? What might be good for client 1 will not hold true for client 2.

Satish Kumar: So really it's the ability to capture the data downstream. We do conduct a post-hire survey as well and feed the data back to recalibrate the test. So closing back this loop is very important. Overall, one of the points that Pat mentioned is that he tracks submission to interview, interview to offer, and submittal to hire as well. Among all these, for us the north star is interview to offer. Why that, and not interview to hire? Because candidates can back out for various reasons beyond the control of the assessment. Right? What we want to measure very closely is whether the hiring manager's hiring decision is in line with the outcome of the assessment, which is the interview to offer.

Satish Kumar: Was it good enough? Looking at the competency report and after conducting the interview, are they willing to hire the candidate? That part we track very closely and recalibrate based on it. Right. And so far customers have been able to achieve an interview-to-offer ratio in the range of 80%, which is significant for customers.

Jonathan Covey: Great. Great. And this conversation around data and closing the feedback loop kind of raises an important topic around integrations in general. Pat, what do integrations look like at Kforce with these assessments, in terms of where data is showing up? Are recruiters living within a native assessment tool like Glider, or are they bouncing to the ATS?

Jonathan Covey: And what are some of the organizational roles, where are people living, and any best practices to share there?

Pat Rush: Yeah. For us, it's a combination of both. We have integrations with some of our tools and not with others, but I think in an ideal state, you want some sort of integration from the assessment to your ATS or whatever database it is where you house your candidate information.

Pat Rush: Because, as we talked about, getting that full picture of the data, that's the easiest way to do it. You can use the assessment data along with your own data that way, and it's much easier to track. With the integration into the ATS, we can also do things like search for, you know, candidate test scores and things like that.

Pat Rush: So for candidates who have historically taken an assessment, you know, maybe we don't necessarily need to have them take one again when they're coming off assignment. If we can resurface that easily within the ATS and keep the recruiter in the home base that they're working out of, you know, most of their day, that's the most ideal scenario. But I do think there are needs to also be within the assessment tool for different types of use cases, obviously, like to send out assessments and things like that,

Pat Rush: and compare assessments to, you know, the broader data that the assessment tool gives you. But ideally, what's going to give you the biggest bang for your buck and let you measure ROI the best is having some type of integration back into, again, whatever system, whether it's an ATS or CRM or something like that, where you house that information.

Pat Rush: And we've seen the most benefit from that. 

Jonathan Covey: Got it. Okay. Dan, thoughts on integrations? And I'd love you to highlight specifically the Traitify-Paradox relationship. Are Traitify assessments being deployed via a conversational bot today? What's to come on that?

Dan Sines: Yeah. I mean, our relationship with Paradox started with an integration.

Dan Sines: So I think, you know, that's how most of these things begin. Just to answer the broad question first, though, I think, you know, the data needs to be wherever someone's going to go, and often that's in the platform that they're on every day, not in a separate space. So you have to think about where they will actually get to this information so that it can be helpful for them. In the case of what we have now with Paradox, it is able to be delivered and deployed through Olivia, through the chat experience, and integrated through our backend experience as well.

Dan Sines: So you can quickly see and sort the candidates by fit, pull up their personality info, pull up interview questions, all within that kind of integrated experience. We have an API as well, and that was kind of the basis for the Traitify business. And so we integrate with ATS platforms. We can integrate with CRMs.

Dan Sines: We want to be wherever that data needs to be for you. That's where it's most useful as an assessment. 
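For reference, an assessment-to-ATS integration of the kind Dan and Pat describe often reduces to pulling a result from the assessment provider's API and writing it onto the matching candidate record. Below is a minimal Python sketch of that pattern; the endpoints, field names, and token are invented placeholders, not Traitify's or any other vendor's real API.

```python
# Hypothetical sketch of an assessment-to-ATS sync. Endpoints, field names,
# and auth are placeholders, not any real vendor's API.
import requests

ASSESSMENT_API = "https://assessments.example.com/api/v1"
ATS_API = "https://ats.example.com/api/v2"
HEADERS = {"Authorization": "Bearer <api-token>"}

def sync_result(candidate_id: str) -> None:
    # 1. Fetch the candidate's latest assessment result.
    result = requests.get(
        f"{ASSESSMENT_API}/candidates/{candidate_id}/results/latest",
        headers=HEADERS, timeout=10,
    ).json()

    # 2. Write the headline score and a report link back to the ATS record,
    #    so recruiters can stay in the system they already work in.
    requests.patch(
        f"{ATS_API}/candidates/{candidate_id}",
        headers=HEADERS, timeout=10,
        json={
            "assessment_score": result["overall_score"],
            "assessment_report_url": result["report_url"],
            "assessment_completed_at": result["completed_at"],
        },
    ).raise_for_status()

if __name__ == "__main__":
    sync_result("cand-12345")  # placeholder candidate ID
```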

Jonathan Covey: Okay. Thank you. Omer, thoughts on integrations and user experience, where people are living?

Omer Molad: I might take the dissenting view. So I've got mixed feelings about integrations. Everyone says they want integrations, whatever the platform is, and believe me, I get it.

Omer Molad: I think when you're providing a rich tool like we are, there's an element of dilution. We'll never be able to provide the exact same level of insight and experience inside someone else's platform. And so what happens is you're going to get a portion. So we might be able to send the score, or even the score by skill group, but you won't get all our reporting and all the insight.

Omer Molad: So what ends up happening, I think, is a combination: time that you spend inside, usually, the ATS, the sort of workflow system, which is convenient, but then when you want to go deeper, you end up coming back. Our clients end up coming back and spending time in Vervoe at some stage. And so we know, whenever we sort of work through integrations, that there are limitations in terms of

Omer Molad: the other side, the platform we're integrating with: how much can their API actually capture? I think that's an important point. So you want to remember that you're not necessarily getting everything when you're using an assessment tool through the ATS. It's important to understand that, and then figure out what you are going to do in your workflow tool

Omer Molad: and when you actually go back for the deeper insight and the functionality of the assessment tool.

Jonathan Covey: Thanks. So, we're getting towards the end here. One of the things that came up when we chatted last week, Pat, you said we've got to cover this: it was around fraud in assessments and validity. Satish, maybe I'll start with you, because Glider is doing some pretty cool stuff around biometric tracking.

Jonathan Covey: Talk to us about the ethics of some of these technologies, like AI biometric tracking, and how staffing companies should think about clients when it comes to this stuff.

Satish Kumar: Okay. So just to be clear, you said two things: you said fraud, and then you added validity as well. To be clear, the fraud part is in the context of candidate behavior, and validity is in the context of the assessment owned by the provider.

Satish Kumar: So let me tackle the fraudulent behavior, the candidate fraud, first. Right. I'm sure most of the audience who have been catering to the contingent market would have heard one or two stories around candidate unfair practices in the hiring process, right? And remote work has made the situation even worse during the pandemic era, because if somehow I can get through the vetting process, then the entire work can be done remotely by somebody else, with much less risk of getting caught.

Satish Kumar: So we track all this behavior, right? In fact, we have internal documents, which we call candidates behaving badly, where we have documented various, you know, things that get caught in the system. Right? Some of the stories are like candidates lip-syncing in the live interview and attributing the disconnect to a bad internet connection.

Satish Kumar: Somebody taking a test on somebody else's behalf, or they're taking help over a phone conversation or over text or WhatsApp, or somebody else is completing the test through remote desktop sharing. Right? Somebody else does the test, somebody else wraps up the interview; all kinds of things happen.

Satish Kumar: In fact, now I'm hearing that people are taking multiple jobs. They're working remote; nobody knows which jobs they're working on. So one such bad apple can ruin the image of the staffing firm in the eyes of a client, to the degree that they can lose the account. So this is serious stuff. Now, let me share some of the real data.

Satish Kumar: In our platform, from 2020 to 2021 there was a 92% increase, almost double, in fraudulent activity in the remote interview and the remote assessment, right? And here's the funny part. This number has jumped despite the fact that AI proctoring is enabled and candidates are pretty much aware that they'll be monitored by the webcam, by the click stream, by the audio, right?

Satish Kumar: Despite that, that much of an increase. We have 50 checkpoints in an automated proctoring system that ensure the genuineness of the candidate's attempt. Right? So it's a weird world, I will say. Remote work sometimes shows human behavior when people think they're not being watched. But this is the reality we live in right now, and it is a very important part, the way I put it: it's not only your ability to assess the competency, but also to verify

Satish Kumar: whether the assessment was taken in a genuine manner, because unless there is that second component, your outcome might not be as valid.

Jonathan Covey: Thanks, Satish. Pat, anything to share there around compliance?

Pat Rush: I agree with everything Satish said, and I think if anybody listening uses assessments, they've probably run into similar situations.

Pat Rush: It has become more of a problem since, you know, the big push to remote work. I do think that there's a scale when we talk about fraudulent behavior with candidates. You know, some of the things that Satish mentioned are the worst end of that, where someone else is taking an interview or they're completely using outside resources to kind of try and game the system.

Pat Rush: You know, you will see with some of these technical assessments that candidates will use outside resources, like a Google, like a GitHub, things like that, to try and solve problems, you know, copy and paste the solution into the IDE. That may not necessarily mean that this is a bad candidate or a fraudulent candidate.

Pat Rush: Maybe the assessments are too difficult for them. Maybe they're not at the level that the client is looking for, and we need to position them into a better opportunity and have that open conversation. And for us, it really comes down to the relationship between the recruiter and the candidate, and that's at the core

Pat Rush: of getting a candidate ready to take any of these assessments. It still comes down to the recruiter's ability to build a relationship and put the candidate in the best position to be successful. So again, there's a varying range of fraudulent behavior, some of it worse than others, but you know, some of these tools like your Glider AI really help staffing firms like Kforce, you know, combat some of those issues.

Jonathan Covey: Thanks. And Dan, on your side, is fraud a relevant thing really for Traitify and Paradox screening? I would think less so.

Dan Sines: Yeah. It's not really as much of a thing. I mean, you can never say there's none. But I think the validity portion of your question is a bigger piece for us: making sure that, when you're looking at an assessment to use, it's validated for the purpose you are using it for.

Dan Sines: And that's a big issue in the behavioral assessment world. You know, a lot of assessments are not validated for the purposes of selection, but are used in selection anyway. And that can create, you know, a lot of bad situations, EEOC compliance issues for you and a bad experience for the candidates. So really do your homework; read the manual that the company provides.

Dan Sines: They should have something like that which gives you more depth on the validity and background of the tool.

Jonathan Covey: Thanks. Omer, anything on fraud?

Omer Molad: So obviously all this stuff happens, and I agree with that. But when we talk about these things, it always makes me think about when we started doing banking and financial transactions on the internet.

Omer Molad: You know, when eBay launched and online banking started and there were people who said, hang on, there's no way I'm going to put my credit card on the internet because someone could steal it. It could end up somewhere. And there's obviously an element of fraud. But there's been financial fraud since the beginning of time with, or without the internet.

Omer Molad: I think it's a similar thing here. So overall we're much better off being able to buy things on the internet, even though there's like a tiny percentage of fraud, and eventually it gets insured and most of the time it's okay. I think, you know, candidates have been lying on their resumes since the beginning of time; they've been lying in interviews since the beginning of time.

Omer Molad: So now a tiny minority, like Satish said, are going to try to game the next frontier, the technology. And I think with remote working, when you're hiring people you haven't met from all over the world, you know, that's amplified a little bit. So it's obviously an issue, but I also think, like, by and large we're far better off, and we've just got to make sure we're being sensible.

Omer Molad: The other thing is Pat's point. We've got to distinguish between genuine fraud, you know, someone actually, you know, paying their cousin to take the assessment instead of them. Okay, so that's identity, that's fraud, that's just lying. Versus, you know, I went to law school and all the exams were open book.

Omer Molad: It's not cheating; it's an open-book exam. And I think if someone can Google something and get to the answer, well, maybe they're more crafty, and, you know, maybe that's good. In startups, we use Google all the time to figure stuff out. The question is, what are you trying to test? You know, maybe the question was too easy if you can Google the answer in two seconds.

Omer Molad: So the assessment needs to be better and get people to actually do work and figure things out. So, you know, I think we've got to distinguish between genuine fraud, which there are measures to detect and prevent, and what we've all got to address: what are we actually trying to test?

Omer Molad: And what do we expect people to know in advance or figure out during the assessment? 

Satish Kumar: Can I make a comment on that? Sure. Yeah. Well, I agree with what Omer said, except one point: we can do both, in the sense of providing the audit trail of what actually transpired and not making the decision on the candidate's behalf or on the client's behalf.

Satish Kumar: You have the complete audit trail of what transpired. There are companies who really promote, hey, go ahead, do whatever search, copy, paste, and that addresses at least part of the problem, but it doesn't solve the whole problem. So provide that confidence and clarity and transparency on the assessment report: this is what happened.

Satish Kumar: See if it makes sense for you to let it go or go after it.

Jonathan Covey: Thanks. Great. So we probably only have time to go around one more time on what I think is an important topic here, which is the major pitfalls to avoid when adopting an assessment. Mistakes that you've seen made on the tech provider side in working with clients, and Pat, maybe lessons learned at Kforce.

Jonathan Covey: So let's talk about that. Who wants to jump in first? And then we can go around and wrap things up. 

Omer Molad: Happy to start. You know, I think we touched on this earlier. I think it's setting people up for success. You don't want an assessment that's not appropriate for the role or the level of the role or the context.

Omer Molad: You don't want something, you know, too long or too frustrating, or, you know, 50 video questions in a row with two-minute time limits, all that kind of stuff. You don't want something that feels like torture. You want something that feels like it gives me an opportunity to show what I can do and to succeed. Or, you know, in Dan's case, it's personality; it's not right or wrong, good or bad.

Omer Molad: It's more about profiling and insight. So you want something that's appropriate and that, as a candidate, feels like it's helping you succeed, not helping you fail. And I think that's a really good foundation to start with.

Pat Rush: I can take a swing at this next. I think I mentioned this in our prep: you know, these assessments, at the end of the day, are not a silver bullet. They're not going to make the hiring decision on behalf of, you know, the client or whoever might be involved in the hiring decision.

Pat Rush: Really, what they are is a piece of the overall evaluation process. I've seen it so many times: we're working with a new client that's gone through a series of bad interviews with candidates, and they completely over-correct. They make an assessment and make it too hard. They think it's going to solve all their problems, and then it ends up doing the opposite.

Pat Rush: It makes things even worse than they were before. So I think it's just understanding that you use it as a piece of the overall evaluation process, the overall experience of the candidate. And from the staffing standpoint, at the end of the day, it still comes down to the recruiter's ability to build the relationship with the candidate,

Pat Rush: understand where the candidate's strengths are and what their motivators are, and put them in the best position to succeed and find the right role that's the best fit for them, using an assessment to help validate and verify their skills and put the candidate's best foot forward. And that's what we've seen be the most successful.

Jonathan Covey: Yeah, excellent point. I mean, you can't just throw technology in there; the people aspect of this is incredibly important. Dan, anything on this last topic of major pitfalls?

Dan Sines: Yeah, I mean, I think maybe from a broad perspective it's picking a company that's set in stone and not moving forward with their assessment process. I think Pat mentioned

Dan Sines: earlier that they're looking for companies to work with that are evolving and growing and changing. The workforce is changing really rapidly and we have to keep up with that. As assessment providers, it's a key part of what we do, but a lot of the kind of old, set ways of assessments don't work like that; they're kind of set and forget.

Dan Sines: But everyone who's on this call is really, I think, pushing for how we can continue to evolve and get better each time. So avoid that pitfall. 

Jonathan Covey: Yeah, both are good: the people aspect, like what Pat mentioned, and then the tech side, to make the process more efficient, get more data, and make it more scalable.

Jonathan Covey: Omer?

Omer Molad: I think I answered that earlier. I think it's Satish who's left, Jonathan.

Satish Kumar: Yeah. I mean, definitely be looking for the pitfalls, right? Are you going to buy yet another tool from a vendor, or buy a business solution from a partner who is going to stay with you? That understanding is very important. But more importantly, at the operational level inside the organization, getting all the stakeholders on your side is important, because top executives might have the right intent and feel good about the system,

Satish Kumar: but especially for the staffing agency, unless the recruiters who are going to actually use it day in and day out are bought into the solution, they will consider it one more step and a hurdle in their placement process. So getting them on board is very important before you roll out the solution for staffing.

Jonathan Covey: Excellent.

Jonathan Covey: Let's park it there. This has been a great discussion. I want to thank all of the panelists for joining and adding these insights. Folks that are listening, you know where to find these companies, and you can reach out to me on LinkedIn if you want to connect and explore some of these use cases further.

Jonathan Covey: But yeah, thanks so much. And hope everyone has a great rest of the conference this week. Awesome. Thank you. 

Pat Rush: Thank you.


Speakers

Omer Molad

Pat Rush

Satish Kumar

Jonathan Covey

Dan Sines

Duration: 58 min
