14 min read
    Phil Schrader

As a people analytics leader, you’re going to be confronted with some not-so-simple, horribly open-ended questions: “Hey, so what do you want to measure?” “Where should we start?” or… “What HR dashboards should we build?” Perhaps these words have been uttered by a well-intentioned business analyst from IT, peering at you from behind a laptop, eager to get your items added to an upcoming sprint before all their resources get tied up with something else. What do you say? Something really gnarly and fancy that shows your analytic savvy? Something that focuses on a key issue confronting the organization? Something basic?

Fear not. In this blog post, we’ll walk you through eight essential people analytics dashboards. You should be able to get the HR data for all of these from your core HRIS or HCM solution, even if it lives in different modules and you have to combine it into one dataset. The key performance indicators (KPIs) in these views will give you the highest impact:

1. Headcount Metrics Dashboard
2. Span of Control Dashboard
3. Employee Turnover (Attrition) Dashboard
4. Talent Flow Dashboard
5. Career Growth / Promotions Dashboard
6. Diversity (DE&I) Dashboard
7. Employee Tenure Dashboard
8. Dashboard Definitions and Details

1: Headcount Metrics Dashboard

Headcount metrics are the foundation of people analytics. Headcount speaks volumes. Trend it over time, break it out by key groupings, and you are well on your way to doing great people analytics. Here’s an initial view that captures the basics, and here’s what’s included in this dashboard so you can get a handle on headcount.

In the upper right, you’ve got what I call the “walking around number”. It won’t help you make an informed decision on its own, but it’s the stat you would be embarrassed not to know off the top of your head if someone asked. Here it’s the total number of employees as of the current point in time. (EOP is shorthand for End of Period. Be precise in how you define things. More on this at the end.)

Next, you’ll want to see headcount trended over time. Here we have a monthly trend paired with the same period last year. Boom. Now you can see how things are changing and how they compare with the previous year. These two visuals are also a great test run for your existing reporting and analytics capabilities.

In the bottom right, you have headcount broken out by org unit (or business unit, or supervisory org for you Workday types). Here you want not only the total counts but ideally a stacked column view, so you can see the proportion of contractors, part-time, co-op, or other employment types. Different orgs might get their work done in different ways. You should know the differences.

Finally, a map view of headcount by geography. It’s not a basic visual, but it has certainly become essential. Things happen in the world. You need to know where your workforce is so you can quickly estimate the impact and plan support. In just the past two years, employees have been impacted by wildfires, heat domes, political unrest, blizzards, cold snaps, flooding, and, of course, COVID. Geo maps have officially gone from fancy visual to essential view.

2: Span of Control Dashboard

I’m going to change things up a bit by elevating span of control to the second slot on this list. Don’t worry. We dive into attrition and representation later in the article. As a people leader, you’ve got to maintain some perspective on how efficiently your workforce gets work done. There are many ways to do this. You could calculate the total cost of your workforce. You could align those costs against revenue over time. By all means, do that. But this list is here to help you get started. With just the data from your core HCM / HRIS system, your team should be able to show you span of control and organizational layers. These metrics always remind me of stepping on a scale. If your span of control is ticking down, you’re getting less lean. If you’re adding more layers, your internal coordination costs are going up. There can be good reasons for this -- but there sure as heck can be bad reasons too. Here you’ll find your key span of control metrics, your trend over time, and your layers and org units visualized. The real killer metric -- if you’ve got the stomach for it -- is a simple list of the managers in your organization who have only one or two direct reports. Use these views to keep your talent management processes grounded in business reality. If your existing team or technology can’t produce these views, ship them back.

3: Employee Turnover Dashboard (or Attrition Dashboard)

Ok, we can’t go any further without employee turnover. Attrition, if you’re feeling fancy. Turnover is the strongest signal you get from your workforce. Someone worked here and -- for one reason or another -- it didn’t work out. Changing jobs and firing an employee are both major events. Your workforce is telling you something, and you need to listen to help you with employee retention. Here’s a basic view to get you started. Again, get your rolling 12-month termination rate up at the top and trend it out against the previous year for context. Below that, you see a breakout of voluntary and involuntary termination rates. Then you can see breakouts by business unit, location, and org tenure groupings. Now, with a glance, you can see how turnover rates are changing, where they are high, and whether it’s you or the employee forcing the change. Learn more about how to calculate the cost of turnover.
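If you want to prototype that rolling 12-month termination rate before the dashboard exists, here is a minimal sketch in pandas. The column names, sample numbers, and the simple average-headcount denominator are all illustrative assumptions, not One Model’s calculation logic (definitions matter -- see section 8).

```python
import pandas as pd

# Illustrative termination events; column names are assumptions, not a real schema.
terms = pd.DataFrame({
    "term_date": pd.to_datetime(["2023-02-15", "2023-06-01", "2023-11-20", "2024-01-10"]),
    "term_type": ["Voluntary", "Involuntary", "Voluntary", "Voluntary"],
})

# Monthly end-of-period headcount for the trailing twelve months (made-up numbers).
headcount = pd.Series(
    [120, 122, 125, 124, 126, 128, 127, 129, 130, 131, 131, 133],
    index=pd.period_range("2023-02", periods=12, freq="M"),
)

as_of = pd.Timestamp("2024-01-31")
window_start = as_of - pd.DateOffset(months=12)
in_window = terms[(terms["term_date"] > window_start) & (terms["term_date"] <= as_of)]

avg_headcount = headcount.mean()  # one possible denominator; be explicit about yours
overall_rate = len(in_window) / avg_headcount
voluntary_rate = (in_window["term_type"] == "Voluntary").sum() / avg_headcount

print(f"Rolling 12-month termination rate: {overall_rate:.1%}")    # ~3.1%
print(f"...of which voluntary:             {voluntary_rate:.1%}")  # ~2.4%
```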
4: Talent Flow Dashboard

Once you’ve got a turnover view squared away, you can move into broader views of talent movement within your organization. Here’s a high-level talent flow view to get started. It leads off with a consolidated view of hires, terms, and net hires trended over time. I love this view because it lends itself to discussions of churn and the cost of turnover. The top area (green) shows external hires. The bottom (red) shows exits/terminations. The dark bars show the difference: net hires. The big question: how much of the time and money that you put into recruiting is just to replace the people who leave the company? A great variation on this view is to limit it to women or underrepresented groups. Are you working hard to attract these demographics, only to have them leave because they don’t find the organization to be a fit for them? We’ll get to more workforce representation views below. Next to the net hire trend, you can mix in a growth metric and a helpful breakout by business unit, so you can keep an eye on which segments of the organization are growing or shrinking. Are they the ones you expect? Later, when you bring in data from other systems like learning, this view can be a place to collaborate with the learning team to answer questions like: are you adding more employees when you could be upskilling? Finally, get a solid crosstab view of promotions or movements. This will help you optimize talent development and answer questions like: Do people move from function to function? If so, what are the common paths? What paths don’t exist? Should they?
5: Career Growth / Promotions Dashboard

After you get the big picture on movements, dig into promotions. In my mind, the movement and span of control views are about what the organization is experiencing. Promotions put you more in the mind of your employees and what career opportunities look like in your organization. I’ve added two of our key metrics to the top of this one: the rate at which people get promoted, and the typical wait for a promotion. Once you know the typical (average or median is fine) wait time, keep your ears out for high-potential / high performers who have run past that mark. They’re probably keeping a rough estimate of that metric in their minds as well. Below that are two breakout views. The first one -- “Manager Hires vs. Promotions to Manager” -- is meant to look at a key milestone in career growth. I’ve used promotion to manager, but you might have unique ones. Then, for each business unit, I’ve compared the number of promotions into that key group with the number of outside hires into that group. Are you growing your own leaders (or another key group)? If not, why? Filling out the bottom row is the “Termination Rate and Headcount by Time since Last Promotion” view. Look for two things here: 1) Do people leave if they don’t get promoted? 2) Do people leave right after they get promoted?

6: Diversity (DE&I) Dashboard

It’s past time we brought in views of diversity, equity and inclusion (DE&I) in your workforce. Many of the views in the dashboard below are split-out versions of the metrics introduced above. Above is a sample diversity dashboard using male / female breakouts. Use this as a template for other representation breakouts, including ethnicity, gender identity, age, etc. Any of these views could be modified to incorporate multiple, rather than just two, groupings. The top bar shows activity differentials over time. Hires are shown simply as counts: do you hire more men than women? Promotions and terminations are handled as rates so you can monitor for disproportionate outcomes, i.e. are men promoted more often than women? The second row shows representation by key grouping in stacked horizontal bars. I like organizational layer and salary band to show whether high career outcomes are disproportionate. I’d recommend including tenure as well. If your organization has had a history of disproportionate staffing, you will get a clue in this view. That could explain why today’s initiatives have not yet balanced out outcomes in level or pay. Or differences in tenure might be explained by differences in termination rates, depicted directly above in this view. This is a multifaceted issue.

7: Employee Tenure Dashboard

Confession: I love tenure. I’ve come of age in my career amid data telling me that I’ll work for something like 11 companies before I retire. And, to be honest, I’ve done my share of career hopping. But it turns out that when you stick around somewhere, you learn things. You make connections with your co-workers. Employee tenure represents the accumulation of invaluable knowledge and connections that help you measure the value of your human capital. Next to average tenure, this dashboard shows the total accumulated workforce tenure in years. While not exactly a “walking around number,” you can use this to impress your fellow leaders into thinking about your workforce like the treasured asset it is.
“Hey, our team has x millennia of accumulated experience!”

Rounding out this view is a sorted view of positions or job titles with lots of accumulated experience, as well as a stacked trend over time to see how tenure groupings are changing.

8: Dashboard Definitions and Details

This final section is not a specific dashboard suggestion. Rather, it’s intended as a sobering reminder that none of the dashboards above will make an impact in your organization if you can’t explain your logic and build trust in your data. I like to build little glossary-style views right into the dashboards I create. For example, at the bottom of our standard attrition storyboards, I’ve added breakouts showing which termination reason codes are counted as voluntary and which as involuntary. Next to my glossary, I’ve created a table that breaks out the subcomponents of turnover rate, such as total headcount and days in period. I like to include at least one leap year for a bit of showmanship. “Look, I’ve even accounted for the fact that 2020 had 366 days, so back off.”

Finally, if your security models and technology support it, drill to detail. This is the number one, all-time champion feature of people analytics. Click on headcount, terminations, whatever, and see the actual people included in the data. Bonus points for adding the definition and “breadcrumb trail” for metrics that build off of other metrics. Below is a view of how we do that in One Model. If you’d like to see these people analytics dashboards in action or learn more about people analytics software for your organization, reach out to us!


    10 min read
    Phil Schrader

Succession planning is a strategic HR function. Its purpose is to map out key positions in the organization and identify potential successors who are (or will be) ready to step into those key positions when they become vacant. Organizations with effective succession planning programs are more resilient. When a critical role is vacated, they already know who can step up and fill it. Succession management also boosts employee motivation, because employees can see a path forward within the organization. Strategic HR activities like this go hand in hand with people analytics. In order to effectively plan for the future, you need clarity around what you want to accomplish and whether you are improving. Metrics help you create that clarity. How many of our plans have successors? How ready are they? What’s our bench strength? Are our successors representative of the wider talent pool? So let’s dig in and talk about that union of strategy and analytics. How do you measure your succession plan readiness, and what are the key metrics for succession planning and leadership development?

Measuring Succession Planning

First, here is an “oldie but a goldie” video walking through the succession planning process. Second, here are the key elements of measuring succession planning:

- Scope: What are the critical roles that require identified successors? Ideally, your program covers all non-entry-level roles, but time is scarce, so prioritize.
- Coverage: Given the scope above, do you have plans set up for all critical roles?
- Readiness: Have you evaluated each successor’s readiness for each plan they are in? Remember that one person might be a successor for multiple positions, and they might be more ready for some roles than others. Readiness can be categorized in high-level groupings, for example “Ready Now”, “Ready in < 1 Year”, and “Ready in > 1 Year”.
- Bench Strength: Given coverage and individual readiness, how strong is your bench? Can you fill all critical roles? Is it still strong if you net out the successors, i.e. account for people who are selected in multiple plans?
- Diversity: Does your plan make full use of the available talent in the organization? Have historical tendencies caused you to overlook strong successors because they have different backgrounds and experiences from the incumbents? Will your leadership ranks become more or less diverse when your plans move into action?

It took me 44 minutes and 56 seconds to pull together the metrics above to answer Question #25 from the People Analytics Challenge. Let me show you the full Succession Dashboard. Connect with us today!

Key Metrics (with Definitions)

Here are the key metrics you can use to address the strategic questions above.

Percent of Leaders with a “Ready Now” Successor

Bottom line: what does your successor coverage look like right now? Count up the number of leaders who have a successor that is ready now. Divide that count by the total number of roles in your succession planning program (see Scope above). For example, if you have 10 positions that you’ve identified as needing a successor and you have a ready-now successor for 7 of those roles, then your percentage of leaders with a ready-now successor is 70%. Now flip that number around and say to yourself, “Ok, if one of our really key people left today, there’s a 30% chance that we’d have no one ready to take over that position.” Don’t let that be you. Use the detailed data from this calculation to create an operational list of the positions without a successor. Then work the list!

Gross and Net Bench Strength

The first metric tells you how ready you are to move on from one key person. Gross and net bench strength give you a sense of how resilient your organization would be in the face of multiple changes. Technical note: these calculations assume that your program has set out to have 3 successors identified for each key role.

- Gross Bench Strength: Total successors divided by total successors needed, ignoring whether the successors are used in multiple plans.
- Net Bench Strength: Total successors divided by total successors needed, only counting each successor once, i.e. taking into account whether the successors are used in multiple plans.

So let’s look at these calculations together. Let’s say you have 10 key roles and you have determined that you should have 3 successors for each. That means your total successors needed is 30. Now go through your plans and add up all the listed successors. Perhaps you have 26. That means you have 26 successors out of the 30 you need, making a gross bench strength of 87%. Awesome. Ok, now let’s get more nuanced. Let’s deduplicate the list of successors. Maybe there are 2 high potentials in that pool who are listed on all 10 plans. An extreme example, but useful for our illustration. That means there are really only 8 unique successors, which makes your net bench strength 8 / 30, or 27%. This difference between a gross bench strength of 87% and a net bench strength of 27% tells you that you have good immediate coverage but low resiliency. You can effectively respond to 1 or 2 people leaving, but beyond that, your bench will be depleted.
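Here is a minimal sketch of that coverage and bench strength arithmetic, assuming a hypothetical export of plans to successor lists (the names and plan structure are invented for illustration).

```python
# Hypothetical succession plan export: key role -> named successors.
plans = {
    "CFO":            ["Ana", "Ben"],
    "VP Engineering": ["Ana", "Chris", "Dee"],
    "VP Sales":       ["Ben", "Ana"],
}
TARGET_PER_ROLE = 3  # program target, per the technical note above

total_needed = len(plans) * TARGET_PER_ROLE

all_listings = [name for successors in plans.values() for name in successors]
unique_people = set(all_listings)

gross = len(all_listings) / total_needed  # reuse across plans counts every time
net = len(unique_people) / total_needed   # each person counts only once

print(f"Gross bench strength: {gross:.0%}")  # 7 listings / 9 needed -> 78%
print(f"Net bench strength:   {net:.0%}")    # 4 people   / 9 needed -> 44%
```

The gap between the two numbers is the whole point: the wider it is, the more your plans lean on a handful of people.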
Incumbent vs. Successor Diversity %

Generally speaking, today’s organizations are looking to take full advantage of their available talent by ensuring that traditionally underrepresented groups are considered for advancement. A simple way to check on this progress is to compare the representation numbers of your incumbents to the representation of your successors. Let’s suppose the current pool of employees in key roles is 10% diverse while your pool of successors is 20% diverse. This is a signal that your succession planning process will contribute to greater diversity in your key positions in the future. Remember to align your successor diversity metrics with the key groupings defined by your organization’s DE&I program. These could include gender, ethnicity, or other employee attributes.

Promotion Rate and Time on Bench

If you make progress on the metrics above, then you’ll be leading your organization into a more resilient future. Good job! But remember, resilience is great for the organization only up to a point. The high-potential employees in your plans have their own career goals. If they feel stuck on the bench, they’re likely to find their next role outside the company. If you are so resilient that you could back up all your key leaders for the next 25 years, then you are fooling yourself. Those high-potential employees listed on your plans will be long gone by then. So keep an eye on the promotion rate of your internal candidates over time (number of promotions / average headcount). They’ll be making their own estimates as well. Alternatively, you might calculate the time on bench for your successors. When one of your successors leaves the company, check to see if they were on the bench too long. Or just ask them in your exit interview. Pay particular attention to the time on bench for your diverse successors. It’s not enough to say, “Look at how diverse our bench is!” if those candidates are continuously passed over for the next big job.
Using Successor Metrics to Support People Strategies

The metrics above are just a starting point. The key to strategic HR and people analytics is a willingness to ask important questions and use data to answer them. Ideally, your succession planning process fits into a larger talent management vision that is supported by a wide range of interconnected datasets and measures. For example, you may be ready to fill key roles with external candidates; your time to fill for similar positions will help you know whether that’s a reasonable backup strategy. Alternatively, your employee pulse survey data and turnover analyses may indicate that you are having a hard time retaining diverse employees. Perhaps this will link back to the time-on-bench calculations discussed above. You are unlikely to find meaningful answers in a single data source, so invest in building the right underlying data architecture to connect data from succession plans, core HR, recruiting, engagement, compensation, and other workforce data. At the same time, keep the strategic focus in mind so that you’re not just doing analytics for analytics’ sake. Come back to the important questions, like: “If we lost someone in a key role today, what’s the percent chance we’d be totally flat-footed with no idea how to replace them?”


    6 min read
    Phil Schrader

It’s always good news when a prospective One Model customer tells me that they use SuccessFactors for recruiting. Given that HR technology in general, and applicant tracking systems in particular, seldom involve feelings of pleasure, my statement bears a bit of explanation. I wouldn’t chalk it up to nostalgia, though like many members of the One Model team, I had a career layover at SuccessFactors. Instead, my feelings for SuccessFactors recruiting are based on that system’s unique position in the evolution of applicant tracking systems. I think of SuccessFactors as the “Goldilocks ATS”. On one hand, SFSF doesn’t properly fit in with the new generation of ATSs like SmartRecruiters, Greenhouse, or Lever. But like those systems, SFSF is young enough to have an API and to have grown up in a heavily integrated technology landscape. On the other hand, SFSF can’t really be lumped in with the older generation of ATSs like Kenexa and Taleo either. Yet again, though, it is close enough to have picked up a very positive trait from that older crowd. Specifically, it still manages to concern itself with the mundane task of, ya know, tracking applicant statuses. (Yeah, yeah, new systems, candidate experience is great, but couldn’t you also jot down when a recruiter reviewed a given application and leave that note somewhere we could find it later without building a report???) In short, SFSF Recruiting is a tweener, and better for it. If you are like me, and you happen to have been born in the fuzzy years between Gen X and Millennials, then you can relate: you’re young enough to have been introduced to web design and email in high school, and old enough to have not had Facebook and cell phones in college.

So let’s take a look at the magic of tracking application status history using data from SuccessFactors RCM, an applicant tracking system. While it seems like a no-brainer, not all ATSs provide full application status history via an API. Since it’s basically the backbone of any type of recruiting analytics, it’s fortunate that SuccessFactors does provide it. For those of you who want to poke around in your own data a bit, the data gets logged in an API object called JobApplicationStatusAuditTrail. In fact, not only is the status history data available, but custom configurations are accounted for and made available via the API as well. This is one of the reasons why, at One Model, we feel that SuccessFactors has, without a doubt, the best API architecture for getting data out to support an analytics program. Learn more about our SuccessFactors integration.

But there is something that not even the Goldilocks ATS can pull off -- making sense of the data. It’s great to know when an application hits a given status, but it’s a mistake to think that recruiting is a calm and orderly process where applications invariably progress from status to status in a logical order. In reality, recruiters are out there in the wild doing their best to match candidates with hiring managers in an ever-shifting context of business priorities, human preferences, and compliance requirements. Things happen. Applicants are shuffled from requisition to requisition. Statuses get skipped. Offers are rescinded. Job requisitions get cancelled without applicants getting reassigned. And that’s where you need a flexible people analytics solution like One Model. You’ll probably also want a high-end espresso machine and a giant whiteboard, because we’re still going to need to work out some business logic to measure what matters in the hectic, nonlinear, applicant-shuffling real world of recruiting.
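As a small illustration of what that status history enables, here is a sketch that turns raw audit-trail rows -- the shape of data you’d extract from JobApplicationStatusAuditTrail -- into ordered status transitions per application. The field names below are placeholders, not the actual SuccessFactors schema.

```python
from itertools import groupby
from operator import itemgetter

# Placeholder rows: one record per status change per application.
audit_trail = [
    {"app_id": 101, "status": "New",       "changed_at": "2024-01-03T09:15:00"},
    {"app_id": 101, "status": "Interview", "changed_at": "2024-01-10T14:02:00"},
    {"app_id": 101, "status": "Offer",     "changed_at": "2024-01-21T11:30:00"},
    {"app_id": 102, "status": "New",       "changed_at": "2024-01-04T10:00:00"},
    {"app_id": 102, "status": "Offer",     "changed_at": "2024-01-18T16:45:00"},
]

# Sort by application and timestamp, then pair consecutive statuses into transitions.
audit_trail.sort(key=itemgetter("app_id", "changed_at"))
for app_id, rows in groupby(audit_trail, key=itemgetter("app_id")):
    statuses = [row["status"] for row in rows]
    print(app_id, list(zip(statuses, statuses[1:])))

# 101 [('New', 'Interview'), ('Interview', 'Offer')]
# 102 [('New', 'Offer')]  <- a skipped status, right there in the data
```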
Once we have the data, One Model works with customers to group and order their application statuses based on their needs. From there, the data is modeled to allow reporting both on the events of applications moving between statuses and on the status of applications at any point in history. You can even look back at any point in time and see how many applications were at a particular status, alongside the highest status those applications eventually made it to. And yes -- we can do time to fill. There are a billion ways of calculating it. SuccessFactors does its customers a favor by allowing them to configure how they would like to calculate time to fill and then putting the number in a column for reporting. If you’re like most customers, though, one calculation isn’t enough. Fortunately, One Model can do additional calculations any way you want them -- as well as offering a “days open” metric and grouped dimension that is accurate both at the current point in time and historically. “Days in status” is available as well, if you want to get more granular. Plus, on the topic of time to fill, there’s an additional tool in One Model’s toolkit. It’s called One AI, and it enables customers to use machine learning to help predict not only time to fill, but also the attributes of candidates that make them more likely to receive an offer or get hired. However, that is another topic for another day. For today, the good news is that if you have SuccessFactors Recruiting, we’ll have API access to the status history data and customizations we need to help you make sense of what’s going on in recruiting. No custom reports or extra connections required. Connecting your ATS and HRIS data also means you can look at metrics like the cost of your applicant sourcing and how your recruiters affect your employee outcomes long term. So here’s to SuccessFactors Applicant Tracking System, the Goldilocks ATS. Ready to get more out of SuccessFactors? Click the button below and we’ll show you exactly how, and how fast you can have it. Quick announcement: click here to view our Success with SuccessFactors Webinar recording and learn how to create a people data strategy!


    10 min read
    Phil Schrader

The One Model difference that really sets us apart is our ability to extract all your messy data and clean it into a standardized data catalog. Let’s dive deeper. One Model delivers people analytics infrastructure. We accelerate every phase of your analytics roadmap. The later phases of that roadmap are pretty fun and exciting: machine learning, data augmentation, etc. Believe me, you’re going to hear a ton about that from us this year. But not today. Today we’re going to back up for a minute and pay homage to an absolutely wonderful thing about One Model: we will help you clean up your data mess.

Messy data? Don’t distress. Josh Bersin used this phrasing in his talk at the People Analytics and the Future of Work conference. From my notes at PAFOW on Feb 2, 2018: you know there are huge opportunities to act like a business person in people analytics. In the talk right before Josh’s, Jonathan Ferrar reminded us that you get $13.01 back for every dollar you spend on analytics. But you have to get your house in order first. And that’s going to be hard. Our product engineering team at One Model has spent their careers figuring out how to pull data from HR systems and organize it all into effective data models that are ready for analytics. If your team prefers, your company can spend years and massive budgets figuring all this out... or you can take advantage of One Model. When you sign up with One Model: 1) we take on responsibility for helping you extract all the data from your HR systems and related tools, and 2) we connect and refine all that data into a standard data catalog that produces answers your team will actually trust. Learn what happened to Synk when they finally had trust.

Big data cleansing starts with extracting the data from all your HR and related tools. We will extract all the data you want from all the systems you want, through integrations and custom reports. It’s part of the deal. And it’s a big deal! For some perspective, check out this Workday resource document and figure out how you’ll extract your workers’ FTE allocation from it. Or if Oracle is your thing, you can go to our HRIS comparison blog and read about how much fun our founder, Chris, had figuring out how to get a suitable analytics data set out of Fusion. In fact, my coworker Josh is pulling some Oracle data as we speak, and let me tell you, I’m pretty happy to be working on this post instead. Luckily for you, you don’t need to reinvent this wheel! Call us up. We’ll happily talk through the particulars of your systems and the relevant work we’ve already done. The documentation for these systems is (for the most part) out there, so it’s not that this is a bunch of classified top-secret stuff. We simply have a lot of accumulated experience getting data out of HR systems and have built proprietary processes to ensure you get the most data from your tools. In many cases, like Workday, for example, we can activate the custom integration we’ve already built and have your core data set populated in One Model. If you go down that road on your own, it’ll take you 2 - 3 days just to arrange the internal meeting to talk about how to make a plan to get all this data extracted. We spent over 10,000 development hours working on our Workday extraction process alone. And once you do get the data out, there’s still a mountain of work ahead of you. Which brings us to...

The next step is refining your extracted data into a standardized data catalog. How do you define and govern the standard ways you are going to analyze your people data? Let’s take a simple example, like termination rate. The numerator is actually pretty straightforward: you count up the number of terminations. Beyond that, you will want to map termination codes into voluntary and involuntary, exclude (or include) contractors, etc. Let’s just assume all this goes fine. Now what about the bottom part? You had, say, 10 terminations in the given period of time, so your termination rate is... relative to what headcount? The starting headcount for that period? The ending headcount? The average headcount? How about the daily average headcount? Go with the daily average, for two reasons. 1) It’s the most accurate. You won’t unintentionally under- or overstate termination rate, giving you a more accurate basis of comparison over time and the ability to correctly pro-rate values across departments. See here for details. And 2) if you are thinking of doing this in-house, it’ll be fun to tell your team that they need to work out how to deliver daily average headcounts for all the different dimensions and cuts to meet your data cleaning requirements.
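For the brave, here is a minimal sketch of that daily average headcount denominator, assuming simple effective-dated employment spells. The date handling is deliberately naive; it shows the shape of the problem, not One Model’s engine.

```python
import pandas as pd

# Illustrative employment spells; an open-ended spell has end = NaT.
spells = pd.DataFrame({
    "employee": ["A", "B", "C", "D"],
    "start": pd.to_datetime(["2023-11-01", "2024-01-10", "2023-05-01", "2023-08-15"]),
    "end":   pd.to_datetime(["2024-01-20", None, None, "2024-01-05"]),
})

days = pd.date_range("2024-01-01", "2024-01-31", freq="D")

# Count the spells active on each calendar day, then average across the period.
active_per_day = [
    ((spells["start"] <= day) & (spells["end"].isna() | (day <= spells["end"]))).sum()
    for day in days
]
daily_avg_headcount = sum(active_per_day) / len(days)

terminations = spells["end"].between(days[0], days[-1]).sum()  # two January exits
termination_rate = terminations / daily_avg_headcount

print(f"Daily average headcount: {daily_avg_headcount:.1f}")
print(f"January termination rate: {termination_rate:.1%}")
```

Now imagine running that for every dimension, every cut, and every period in your catalog -- that is the battle you are signing up for in-house.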
If you really want to, you can fight the daily average headcount battle and many others internally. But we haven’t even gotten to time modeling yet, which is so much fun it may get its own upcoming One Model Difference post. Or the unspeakable joy you will find managing organizational structure changes -- see #10. On the other hand, One Model comes complete with a standard metrics catalog of over 590 metrics, along with the data processing logic and system integrations necessary to collect that data and calculate those metrics. You can create, tweak, and define your metrics any way you want to. But you do not have to start from scratch. If you think about it, this One Model difference makes all the difference. Ultimately, you simply have to clean up your messy data. We recognize that. We’ve been through it before. And we make it part of the deal. Our customers choose One Model because we’re raising the standard and setting the pace for people analytics. If you are spending all your time gathering and maintaining data, the yardstick for good people analytics is going to accelerate away from you. If you want to catch up, book a demo below and we can talk. Tell us you want to meet.

About One Model: One Model helps thriving companies make consistently great talent decisions at all levels of the organization. Large and rapidly growing companies rely on our People Data Cloud™ people analytics platform because it takes all of the heavy lifting out of data extraction, cleansing, modeling, analytics, and reporting of enterprise workforce data. One Model pioneered people data orchestration, innovative visualizations, and flexible predictive models. HR and business teams trust its accurate reports and analyses. Data scientists, engineers, and people analytics professionals love the reduced technical burden. People Data Cloud is a uniquely transparent platform that drives ethical decisions and ensures the highest levels of security and privacy that human resource management demands.


    5 min read
    Phil Schrader

Analytics is a funny discipline. On one hand, we deal with idealized models of how the world works. On the other hand, we are constantly tripped up by pesky things like the real world. One of these sneaky-hard things is how best to count up people at various points in time, particularly when they are liable to move around. In other words, how do you keep track of people at a given point in time, especially when you have to derive that information from a date range? Within people analytics, you run into this problem all the time. In other areas, it isn’t as big of a deal. Outside of working hours (sometimes maybe during working hours), I run into this when I’m in the middle of a spreadsheet full of NBA players. Let’s explore by looking at an easy-to-reference story from 2018.

Close your eyes and imagine I’m about to create an amazing calculation when I realize that I haven’t taken player trades into consideration. George Hill, for example, starts the season in Sacramento but ends it in Cleveland. How do you handle that? Extra column? Extra row? What if he had gotten traded again? Two extra columns? Ugh! My spreadsheet is ruined! Fortunately, One Model is set up for this sort of point-in-time metric. Just tell us George Hill’s effective and end dates, and the corresponding metrics will be handled automatically. Given the data below, One Model would place him in the Start of Period (SOP) Headcount for Sacramento and the End of Period (EOP) Headcount for Cleveland.

Team        Effective Date    End Date
Sacramento  2017-07-10        2018-02-07
Cleveland   2018-02-08        ---

Along the way, we could tally up the trade events. In this scenario, Sacramento records an outbound trade of Hill and Cleveland tallies an inbound trade. The trade itself would be a cumulative metric. You could ask, “How many inbound trades did Cleveland make in February?” and add them all up. Answer -- they made about a billion of them. Putting it all together, we can say that Hill counts in Cleveland’s headcount at any point in time after Feb 7. (Over that period, Cleveland accumulated 4 new players through trades.) So the good news is that this is easy to manage in One Model.

The bad news is that you might not be used to looking at data this way. Generally speaking, people are pretty comfortable with cumulative metrics (how many hires did we make in January?). They may even explore how to calculate monthly headcount and are pretty comfortable with the current point in time (how many people are in my organization?). However, being able to dip into any particular point in time is new. You might not have run into many point-in-time scenarios before -- or you might have run into versions that you could work around. But there is no hiding from them in people analytics. Your ability to count employees over time is essential. Unsure how to count people over time? Never fear. We’ve got a video below walking you through some examples.
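If you’d like to see the trick in miniature, here is a sketch of that effective-dated lookup in plain Python, using the George Hill rows from the table above (illustrative only -- One Model handles this for you).

```python
from datetime import date

# Effective-dated rows from the table above; end=None means still effective.
stints = [
    {"team": "Sacramento", "start": date(2017, 7, 10), "end": date(2018, 2, 7)},
    {"team": "Cleveland",  "start": date(2018, 2, 8),  "end": None},
]

def headcount(team, as_of):
    """Count the rows effective on a given day -- the point-in-time move."""
    return sum(
        1 for s in stints
        if s["team"] == team
        and s["start"] <= as_of
        and (s["end"] is None or as_of <= s["end"])
    )

# Hill lands in Sacramento's start-of-period headcount for February 2018
# and in Cleveland's end-of-period headcount for the same month.
print(headcount("Sacramento", date(2018, 2, 1)))   # 1 (SOP)
print(headcount("Cleveland", date(2018, 2, 28)))   # 1 (EOP)
```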
If you think this point-in-time stuff is pretty cool, then grab a cup of coffee and check out our previous post on the Recruiting Cholesterol graph. There we take a more intense look beyond monthly and yearly headcount and dive deeper into point-in-time calculations. Also, if you looked at the data above and immediately became concerned about the fact that Hill was traded sometime during the day on the 8th of February and whether his last day in Sacramento should be listed as the 7th or the 8th -- then please refer to the One Model career page. You’ll fit right in with Jamie :)

Want to read more? Check out all of our People Analytics resources.

About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own. Its newest tool, One AI, integrates cutting-edge machine learning capabilities into the platform, equipping HR professionals with readily accessible, unparalleled insights from their people analytics data.


    8 min read
    Phil Schrader

I recently sat down with Culture Curated’s Season Chapman and Yuliana Lopez to ask them which metrics were their favourite, and Yuliana said net hires. Let’s find out why. Net hires are a critical component of workforce management, as they help organisations determine staffing needs, forecast future headcount, and make informed decisions about recruitment and retention strategies. In this One Model blog post, we’ll explore the concept of net hires, how it’s calculated, and why it’s essential for organisations to track this metric.

What are net hires?

Net hires, also known as net hiring or net employment, is a measure that tracks the difference between the number of new employees who are hired and the number of employees who leave an organisation during a specific period. This metric provides valuable insights into a company’s workforce dynamics, such as the rate of employee turnover, the pace of recruitment, and the organisation’s overall hiring needs. It is an essential component of workforce planning and management, as it helps organisations determine staffing needs, forecast future headcount, and make informed decisions about recruitment and retention strategies. For example, if a company hires 50 new employees during a quarter and loses 20 employees during the same period, the net hires for the quarter would be 30 (50 - 20 = 30). A positive net hires value indicates that the organisation is expanding its workforce, while a negative value indicates that the organisation is reducing its workforce. The chart below shows new hires juxtaposed against terminations. Explore several ways to visualize headcount here. I like using One Model for the presentation of this data because you can quickly adjust by any segment or time period to see how the story changes when looking at it from different angles.

Calculating net hires

To calculate net hires, organisations need to track the number of employees who join and leave the company during a specific period. This information can be obtained from various sources, such as HRIS records, payroll systems, and employee surveys. Once the data has been collected, organisations can use the following formula:

Net hires = Total number of new hires - Total number of terminations

For instance, if a company hired 100 new employees and had 50 terminations during a specific period, the net hires for that period would be 50 (100 - 50 = 50). Having trouble balancing headcount with net internal movements? Learn more.
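Here is a minimal sketch of that formula applied period by period in pandas, assuming a simple event log of hires and terminations (the column names and counts are made up).

```python
import pandas as pd

# Made-up event log: one row per hire or termination.
events = pd.DataFrame({
    "event_date": pd.to_datetime([
        "2024-01-05", "2024-01-12", "2024-01-20", "2024-01-25",
        "2024-02-03", "2024-02-14",
    ]),
    "event_type": ["hire", "hire", "hire", "term", "hire", "term"],
})

monthly = (
    events.assign(month=events["event_date"].dt.to_period("M"))
          .pivot_table(index="month", columns="event_type",
                       values="event_date", aggfunc="count", fill_value=0)
)
monthly["net_hires"] = monthly["hire"] - monthly["term"]  # hires minus terminations
print(monthly)
#          hire  term  net_hires
# 2024-01     3     1          2
# 2024-02     1     1          0  <- the zero result your tooling needs to handle
```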
Why is monitoring net hires important?

Net hires are a critical metric for organisations for several reasons. Firstly, they provide insights into the organisation’s overall workforce trends, such as the pace of recruitment, the rate of turnover, and the company’s growth trajectory. By tracking net hires over time, organisations can identify patterns and trends in their hiring practices and adjust their recruitment strategies accordingly. Secondly, by understanding the rate at which employees are joining and leaving the company, organisations can make informed decisions about their recruitment and retention strategies, including whether to ramp up hiring efforts, invest in employee training and development, or adjust staffing levels in response to changing market conditions. Finally, net hires can also help organisations evaluate the effectiveness of their recruitment efforts. By tracking the number of new hires, organisations can assess the success of their recruitment campaigns and identify areas for improvement. Additionally, by comparing net hires to other metrics, such as employee engagement and retention rates, organisations can gain a more comprehensive view of their overall talent management strategy.

Challenges of tracking net hires

While net hires are an essential metric, tracking them can be challenging. One of the main challenges is ensuring the accuracy of the data. HR records and payroll systems are prone to errors and inconsistencies, which can lead to inaccurate calculations of net hires. Moreover, tracking net hires requires a robust data infrastructure, including data collection, storage, and analysis tools. Another challenge is defining the period over which net hires should be calculated. Since you are measuring change over time, you could run into a situation where you get a zero result in calculated measures; having a tool that can understand and make sense of that is important. Organisations also need to determine whether to track net hires on a monthly, quarterly, or annual basis, depending on their specific workforce management needs. Moreover, organisations need to ensure that the period over which net hires are calculated is consistent across all departments and business units, to enable accurate comparisons.

Optimising net hires

To optimise net hires, organisations need to adopt a data-driven approach to recruitment and talent management. A key way to do that is by using people analytics tools to track and analyse workforce data, including net hires, turnover rates, and engagement levels.

Final lessons from Season

As you heard in the video, Season doesn’t like looking at one metric, or a metric at a single point in time, because it’s misleading. With that in mind, we know that net hires mean less if you don’t understand your termination metrics and recruitment rate. Remember to think of all the contributing factors and explore the data at your disposal to create a comprehensive story that creates value for your organization.

The power of segmenting headcount

In addition to looking at supporting metrics, you should also be segmenting your headcount audience to see if there are trends across departments or geography. Only looking at things as a whole may be misleading. That’s why using a tool like One Model with flexible storyboards is vital to put all the pieces of the same story on the same page. Make sure that a headcount dashboard is one of the first essential dashboards you build. Ready to learn more? Watch me build this report live. Connect today.


    2 min read
    Phil Schrader

It's an all-too-common scenario: a rush request comes in from the leadership team -- in this case, leadership in the finance department. They want to understand some cost impacts in our workforce and how it's changing. But it's Friday afternoon, so I want to get this request done as quickly and accurately as I can. Do you want to be able to knock out highly accurate ad-hoc reports fast? Get a demo of One Model to see how!


    8 min read
    Phil Schrader

Turnover is the strongest signal you get from your workforce. Someone worked here, and -- for one reason or another -- it didn’t work out. Voluntary termination of employment is a major event, and you need to pay attention to the reasoning (and the data) to help you with employee retention. While some degree of turnover is inevitable, the high cost of losing an employee can have a major impact on your bottom line. So, how do you calculate voluntary termination, and what can you do to combat it? Let’s take a closer look.

How to Calculate the Cost of Turnover

There are a number of ways of calculating the cost of turnover. The most common (but less accurate) method is to multiply the average salary of the position by the number of separations. For example, if you have 10 employees who make an average salary of $50,000 per year and five resign, the turnover cost would be $250,000 (5 x $50,000). However, this method doesn’t account for what it actually takes to find and train replacement employees. A more accurate way to calculate the cost of turnover is to use a formula that factors in recruiting costs, training costs, and lost productivity. Counted that way, a single $50,000-per-year departure can easily run to around $37,500 on its own once those components are added up. The ability to calculate voluntary attrition internally will bring a new dimension to your leadership team. However, these benchmarks only serve as a baseline for your turnover calculator, as there are several other variables and data points to consider, including:

- Daily rate of the hiring manager’s salary
- Estimated hours spent interviewing and screening resumes
- Estimated cost of advertising for the available position
- Daily rate of the departed employee’s salary plus benefits
- Number of days the position will remain open before you rehire
- Cost to conduct a background check
- Daily rate of the hiring manager’s or trainer’s annual salary
- Total days the hiring manager or trainer will spend with the new employee
- Number of working days in the new hire’s onboarding period

It’s also important to factor in position levels. The Center for American Progress (CAP) found that the cost of staff turnover was, on average, 213% of the annual salary for highly skilled employees. Segment the positions into three different salary levels for a more accurate turnover calculator: average entry-level, average mid-level, and average technical salaries.
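To keep the arithmetic honest, here is a minimal cost-of-turnover sketch with every input labeled. All of the dollar figures and day counts are placeholders; substitute the variables from the list above.

```python
# Placeholder inputs -- swap in your organization's real figures.
departures = 5
avg_salary = 50_000               # annual salary of the vacated role
daily_output = avg_salary / 260   # rough value of one working day of output

recruiting_cost = 7_000           # advertising, screening, background checks per hire
training_cost = 4_500             # trainer time and materials per new hire
days_vacant = 75                  # working days the role sits open (lost output)
ramp_days = 120                   # new-hire ramp-up at roughly half productivity

# Simple method: average salary times separations (the rough version above).
simple_estimate = departures * avg_salary

# Fuller method: the per-departure costs the salary-only method hides.
per_departure = (recruiting_cost + training_cost
                 + days_vacant * daily_output
                 + ramp_days * 0.5 * daily_output)
fuller_estimate = departures * per_departure

print(f"Simple estimate: ${simple_estimate:,.0f}")
print(f"Per departure:   ${per_departure:,.0f}")   # ~ the $37,500 ballpark above
print(f"Fuller estimate: ${fuller_estimate:,.0f}")
```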
Tracking Voluntary Attrition Over Time

The simple truth is that you will not get a full picture of what is happening from a single calculation, and calculating manually on a consistent basis is time-consuming. You need to see the trends. Create a storyboard to see if trends emerge. You can break down the involuntary and voluntary attrition rate by business unit, location, and organization tenure groupings. You can also quickly see at a glance how turnover rates are changing, where they are high, and whether it’s you or the employee forcing the change. It took me 49 minutes to pull this cost of turnover visual together from scratch. How long does it take you to answer question #59? See how quickly you can take the People Analytics Challenge and answer over 90 of the top questions asked of people analytics teams.

You Know How to Calculate Employee Churn Rate and Turnover Cost -- Now What?

Now that we know how to calculate the cost of turnover, let’s look at how you can mitigate voluntary attrition in your organization.

Use One AI Recipes to Predict Trends and Make Proactive Changes

You know the “what”, but you really need to know the “why”. So, run a predictive model on that data to pull out the correlations and understand the why behind the attrition. One AI Recipes will help you predict the likelihood of a person in a selected population voluntarily terminating within a specified period of time. To do this, One AI will consider a number of attributes and will train the model on the population at a defined point in the past. Does distance to the office, time since last promotion, or paid sick leave correlate with a rise in attrition? AI is a tool that helps you make connections and better understand voluntary resignation reasons in order to take specific actions leading to improvement.

Improve Hiring Practices

Poor hiring practices could easily be one of the reasons why your voluntary attrition rate is high. Be sure to clearly define the skills and experience required for each position and only interview candidates who meet those criteria. Conduct thorough reference checks, and don’t hesitate to pass on a candidate if there are any red flags. After analyzing your hiring process, you can incorporate the proper people analytics data to produce the most accurate cost of losing an employee.

Promote from Within

Another great way to reduce turnover is to promote from within whenever possible. Not only does this show your employees that there are opportunities for advancement within your company, but it also helps reduce training costs because you already have someone on staff who knows your company culture and how things work.

Invest in Employee Development

Finally, investing in employee development is a great way to reduce turnover rates. Employees who feel like they are learning and growing in their roles are more likely to stick around. Offer professional development opportunities through tuition reimbursement programs or paid memberships to professional organizations.

Final Thoughts on How to Calculate the Cost of Turnover

Sure, we want to understand how much turnover costs the company; that is the first step in getting leadership to care. However, the real work begins when you understand why people are leaving and can build a plan to curb the costs. People analytics offers real-time labor market intelligence to help businesses identify the pain points causing turnover. And considering the high cost of losing an employee and its impact on your bottom line, employee retention is critical in today’s economy. One AI Recipes make creating a predictive model from your people data as easy as choosing the outcome you want to predict and answering a series of questions about the data you want to leverage to make the predictions. The result is a predictive model based on robust, clean, and properly structured data -- all without engaging a data engineer or data scientist. Calculating turnover is the first step toward helping you understand and predict trends, reduce turnover rates, and keep your business running smoothly. Watch Me Build a Turnover Analysis Live. Request a Personal Demo Today.


    5 min read
    Phil Schrader

The Power of Combining Data Sources

Am I weird for having a favorite metric that I always pull once I connect a customer’s HRIS and recruiting data for the first time? Oh well. Let’s talk about my favorite merged-source metric: first-year attrition by recruiter! I think it’s one that can be useful for managing a recruiting function, but it’s also a helpful classroom example to explain why we all need to merge recruiting and HR data. Connecting data across HR systems can be a tricky problem, but with the right tools, it is possible to gain valuable insights into employee behavior and business outcomes. In the video above, we explored how One Model can be used to blend data from different HR systems and gain insights into key metrics such as new hire turnover rate. Join the conversation on LinkedIn.

Segmentation is Key to Understanding Why Additional Data Sources Matter

One of the key features of One Model is the ability to quickly break down data into more meaningful groups. Peter Howes will back me up in saying HR data without segmentation is worse than useless. To expand on my video example, by grouping turnover rate by year, we can get a better understanding of the overall trend in general employee retention. Additionally, by narrowing our employee outcomes analysis to specific subsets of employees, such as those who joined the company within the last year (or by gender, department, etc.), we can gain insights into specific areas of concern, such as early termination rates. But we can get these insights with data from just one system. What happens when we combine data from, say, our recruiting platform?

You Have the Power When You Join HR Data Sources

Another powerful feature of One Model is the ability to connect data from multiple systems, such as recruiting data from your ATS and core workforce data from your HRIS (to use my video example). You can now make discoveries that actually improve processes within your organization. By connecting who has turned over with who actually recruited that person, we can make leadership decisions and work with L&D on potential coaching opportunities.

Finding the “Why” After You Merge Recruiting and HCM Data

Many people analytics teams (whether through intensive spreadsheet work or quickly using a tool like One Model) can create these insights, but interpreting them still requires the nuance and care of an HR analytics leader. Many struggle with providing the “why” behind the data. If you ask a seasoned recruiter, they will most likely say the number one reason is the applicant feeling misled in the hiring process, which can increase new hire turnover. But are there other factors at play? Start with an exploratory data analysis, and then get sophisticated with an AI engine that non-data scientists can actually use. Overall, the Explore tool in One Model makes it easy to connect data across HR systems and gain valuable insights into employee behavior and success rates. Whether you are an HR professional or a business leader, this tool can help you make data-driven decisions and improve your organization’s performance. Want to See Phil Merge More Data? Schedule a Demo Today.
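P.S. For the spreadsheet-inclined, here is a minimal sketch of the join described above: terminations from the HRIS matched back to the recruiter recorded in the ATS, rolled up into first-year attrition per recruiter. Every column name and number is an illustrative assumption.

```python
import pandas as pd

# Hypothetical extracts: hires from the ATS, exits from the HRIS.
ats = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "recruiter": ["Kim", "Kim", "Lee", "Lee"],
    "hire_date": pd.to_datetime(["2023-01-09", "2023-03-01", "2023-02-13", "2023-06-05"]),
})
hris = pd.DataFrame({
    "employee_id": [2, 3],
    "term_date": pd.to_datetime(["2023-09-15", "2024-08-01"]),
})

# The cross-system join: employee_id links the ATS hire to the HRIS exit.
merged = ats.merge(hris, on="employee_id", how="left")

# Left the company within 365 days of hire (still-employed rows compare as False).
merged["first_year_exit"] = (merged["term_date"] - merged["hire_date"]).dt.days < 365

print(merged.groupby("recruiter")["first_year_exit"].mean())
# Kim    0.5   <- one of Kim's two hires left inside a year
# Lee    0.0
```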


    6 min read
    Phil Schrader

It can be hard to select the right people analytics projects. There’s no shortage of options to choose from -- turnover risk, career path analysis, pay equity, impact of learning programs, etc. And there’s also no shortage of hurdles to delivering an impactful analysis. But there almost always is a shortage of organizational support for people analytics, so if you pick the wrong project, you run the risk of burning through the limited executive support you have. With all that in mind, here are my 3 reasons to consider prioritizing an analysis of recruiting source costs.

Reason 1: The data requirements are relatively minimal and generally less sensitive

Applicant sourcing cost analytics minimize two of the main challenges of people analytics: data security and data extraction. While the analysis is based on job applicant data, the data you need is not overly sensitive. You just need to know how many applications you received from a given source, in a given time frame, and what ultimately happened to those applications. Aside from data validation, you don’t need to dig into the personal details of the job applicants for this analysis. You really shouldn’t hit too many data management, compliance, or security snags with this one. Once you’ve gotten ahold of your application data split by source, you need to gather up your job board pricing and other relevant source costs. While this data may not be tracked in your ATS, it shouldn’t be too hard to gather. You could literally start by grabbing a legal pad and jotting down the approximate annual spend by source. (Ok, not a legal pad, but a very simple spreadsheet. We do not currently ingest data via legal pad.) This would be a great time to do a retrospective analysis of 2020 recruiting if you haven’t done so already. On the source cost side, you would just need a simple table of annual spend by source. That’s it. That could be the entire supplemental data set for 2020, in a CSV file that’s less than a kilobyte. Once you have this, you can do several things, like compare it to the number of applicants or the number of hires to get to the average cost of recruiting a new employee.

Reason 2: Cost-per-hire metrics will come up time and time again

This brings me to reason two for prioritizing source cost analytics: it’s an opportunity to build your team’s experience with cost allocation metrics. Being able to efficiently and accurately allocate source costs will serve you very well -- opening the door to progressively more complex and impactful cost-of-workforce calculations. A couple of years ago, I built a video that shows how this works in detail in One Model. It’s one of my favorite things about our calculation engine. You can check that out in the video below. The short answer, though, is that it hinges on being able to calculate recruiting cost per day and then dynamically manage days in period. It’s certainly okay and valuable to start with some back-of-the-envelope calculations like “$10,000 spent last year and 5,000 applications received equals $2 per application (or per hire)”, but it’s also very much worth investing in the logic and systems that enable you to do this on the fly and drill down to the month, week, day, etc. (a minimal sketch follows below). Recruiting source costs are a good gateway into cost allocation because, as noted above, the underlying data is pretty manageable. Once you can run this for source costs, then you can jump into salaries, benefit costs, etc.
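Here is that back-of-the-envelope cost-per-day allocation in code, with made-up spend and application counts. It shows the shape of the calculation, not One Model’s engine.

```python
from datetime import date

# Made-up annual spend per source (the tiny CSV described above).
annual_spend = {"LinkedIn": 10_000, "Indeed": 6_000, "Referrals": 2_500}

# Made-up applications received per source during the window we're analyzing.
applications = {"LinkedIn": 1_400, "Indeed": 900, "Referrals": 120}

window_days = (date(2020, 3, 31) - date(2020, 1, 1)).days + 1  # Q1 2020 = 91 days
days_in_year = 366  # 2020 is a leap year -- the showmanship matters here too

for source, spend in annual_spend.items():
    cost_per_day = spend / days_in_year        # allocate annual spend to days...
    window_cost = cost_per_day * window_days   # ...then roll up any period you like
    cost_per_app = window_cost / applications[source]
    print(f"{source}: ${window_cost:,.0f} in Q1, ${cost_per_app:.2f} per application")
```

Because spend is allocated to days first, the same inputs roll up cleanly to a month, a quarter, or any other slice you need.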
The third big reason to prioritize a recruiting source cost analysis is that it will probably pay for itself! Let's face it: recruiting employees is a time-consuming and costly process. It's easy to help your recruiting team save money once you connect the dots between source spending and recruiting output. All you need to do is stack rank your sources by "cost per" and then reallocate your spend to the ones that perform the best (see the sketch below). If your LinkedIn or Indeed rep calls you up and asks why you're not spending more-- just show them the data. You might find they are willing to give you a better offer! I had front-row seats for an absolute master class in this back at Jobs2web. There I was working with Steve Shaffer, Linda Moller, and some of our early analytics customers (here's to you, Annette and Brent!). People analytics teams are often criticized for not having enough ROI or dollars-and-cents style analyses. You can change that impression with a solid source cost ROI analysis and an early win on saving costs. So there you have it-- my argument for why a recruiting source cost analysis is a great "quick win" project for your people analytics program. If you're curious about the details or underlying math, please reach out and schedule a time to chat. I do love this topic and am always up for a good conversation about recruiting data. Let's Get Your Recruiting Cost Conversation Started
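To make that "cost per" stack ranking concrete, here is a minimal sketch in Python. The source names, spend figures, and application counts are all invented for illustration; you would substitute your own ATS export and legal-pad spend numbers.

```python
import pandas as pd

# Hypothetical annual spend by source -- the "legal pad" data set.
costs = pd.DataFrame({
    "source": ["LinkedIn", "Indeed", "Employee Referral", "Careers Site"],
    "annual_spend": [60000, 25000, 15000, 5000],
})

# Hypothetical application outcomes by source, pulled from your ATS.
outcomes = pd.DataFrame({
    "source": ["LinkedIn", "Indeed", "Employee Referral", "Careers Site"],
    "applications": [4000, 6000, 300, 900],
    "hires": [25, 30, 12, 8],
})

report = costs.merge(outcomes, on="source")
report["cost_per_application"] = report["annual_spend"] / report["applications"]
report["cost_per_hire"] = report["annual_spend"] / report["hires"]

# Stack rank the sources by cost per hire, cheapest first.
print(report.sort_values("cost_per_hire")[
    ["source", "cost_per_application", "cost_per_hire"]])
```

The output is exactly the ranking you would bring to a budget conversation: which sources deliver hires cheaply, and which ones you might negotiate down or drop.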

    Read Article

    7 min read
    Phil Schrader

People analytics provides insight into your organization's workforce. Your company's workforce is at or near the top of your organization's expenses and strategic assets, so describing the importance of people analytics is very much an exercise in stating the obvious. For this reason, more and more companies are relying on people analytics, and that reliance is growing even as economic conditions change. In fact, as economic conditions become more challenging, people analytics becomes more, not less, important. Imagine a pilot flying in bad weather: data on altitude, speed, location, etc., become even more critical in that context. So yes, it makes sense to invest in people analytics now, even amidst our current economic concerns. People analytics in a recession is one of the most measurable strategies that HR can pursue. Whether you are hiring during a tight labor market or working through the implications of layoffs and reorganizations, you will want accurate, multi-dimensional, effective-dated, relational analytics ready to guide your decisions. People analytics doesn't just help organize HR data. It generates faster insights from widely-dispersed HR data to make better talent decisions. For example, your people teams can better manage workforce and staffing levels, maximize productivity, and avoid guesswork about their diversity and inclusion objectives. "New and improved" HR reports alone won't cut it. With people analytics, your analysts and managers can run exploratory data analysis to connect and understand relationships, trends, and patterns across all of their data. Additionally, the analysis adds context and meaning to the numbers and trends that you're already seeing. Here are the advantages of people analytics and why you should budget for it in a recession.
Advantage #1 - Save money with people analytics. For nearly every business, labor is one of its most significant costs, but human capital is essential to generating revenue. HR analytics provides strategic and tactical visibility into one of your organization's most vital resources: its people. When your company uses analytics to manage the right people out, it can also use analytics to focus your recruitment efforts. After all, direct replacement costs for an employee can run 50% to 60% of their annual salary, with overall costs ranging from 90% to 200%. For example, if an employee makes $60,000 per year, it costs $30,000 to $36,000 just to replace that employee, and roughly $54,000 to $120,000 in overall losses to the company.
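As a quick gut-check on that math, here is a minimal sketch. The 50-60% and 90-200% ranges are the ones cited above; the $60,000 salary is just the example figure.

```python
def replacement_cost(salary: float) -> dict:
    """Estimate replacement costs using the 50-60% (direct replacement)
    and 90-200% (overall loss) of-salary ranges cited above."""
    return {
        "direct_low": salary * 0.50,
        "direct_high": salary * 0.60,
        "overall_low": salary * 0.90,
        "overall_high": salary * 2.00,
    }

print(replacement_cost(60000))
# {'direct_low': 30000.0, 'direct_high': 36000.0,
#  'overall_low': 54000.0, 'overall_high': 120000.0}
```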
HR analytics can also become a strategic advisor to your business by surfacing insights into how your organization is changing. For example, people analytics can track trends in overtime pay, pay rate changes for various positions, and revenue per employee (to name a few). While the revenue per employee calculation is a macro number, it's important for you to be attuned to how it's changing. Knowing the trends of your revenue per employee can lead directly to asking important questions about your people strategy: Are we investing in people now for future revenue later? Are we running significantly leaner than we have in the past? Are we running too lean? If metrics like revenue per employee or overtime pay are trending the wrong way over time, it could indicate that adjustments need to be made on a departmental level.
Advantage #2 - Identify trends affecting morale or productivity. People analytics can also help you identify trends within your workforce that may be negatively affecting your business. HR data can help you pinpoint what is causing the change, and then address these issues early so you can avoid potential problems down the road. For example, Cornerstone used metrics such as policy violations and involuntary terminations to identify "toxic" employees harming the company's productivity. The findings showed that hiring a toxic employee is costly for employers-- to the tune of $13,000 (source). And this number doesn't even include long-term productivity losses due to the negative effects those toxic employees had on their colleagues. With people analytics, Cornerstone identified common behavioral characteristics of toxic employees and now uses this data to make more informed hiring decisions. This created immediate benefits for their existing employees as well as future advantages as their workforce evolved.
Advantage #3 - Recruit and retain top talent. The many benefits of people analytics also include a competitive edge when it comes to recruiting and retaining top talent. By understanding the needs and wants of your employees, you can create a workplace that is more attractive to potential candidates. In a world where data is constantly being updated, it's important for talent acquisition and HR leaders to make informed decisions quickly. HR analytics gives them that power at speed (rather than waiting months before seeing what happened). Using AI to discover shared qualities of your top performers can also help your talent acquisition team select candidates who will fit well into your culture and start driving results.
Advantage #4 - Identify high-performing departments. Another one of the advantages of HR analytics is its ability to pinpoint positive changes as well. HR leaders can track department performance to know when to reward or incentivize employees for their productivity and work ethic. Additionally, it can help you keep your employees happy and engaged, which is essential for maintaining a high level of productivity (and sales). For example, Best Buy analyzed its HR data to discover that a 0.1% increase in employee engagement resulted in more than a $100,000 increase in annual income. Further, AMC's people data showed that theaters with top-performing managers earned $300,000 more in annual sales than other theaters. These HR insights also helped this Fortune 500 company identify top talent and ideal candidates for its managerial positions, which ultimately resulted in a 6.3% increase in engagement, a 43% reduction in turnover, and a 1.2% rise in profit per customer.
Identify Trends With Real-Time Labor Market Intelligence. Ultimately, HR analytics offers real-time labor market intelligence to help businesses identify pain points causing turnover-- something that's essential in today's hiring climate, infamously referred to as "The Great Resignation." The rise in turnover rates is a nationwide problem. It's important for companies to find out why their employees are leaving and then create an effective strategy so they can stop the trend before it gets worse. One Model's people analytics software can be a valuable tool for any business, especially during a downturn. In short: you should budget for HR analytics as an investment, not a cost. If you're worried about a recession, you can start performing complex analysis on your data in just a few weeks. Let us show you 1:1

    Read Article

    9 min read
    Phil Schrader

People analytics teams tend to shy away from calculating revenue per employee. It's a very macro number. On its own, a single revenue per employee calculation tells you very little. Even large differences in revenue per employee do not necessarily mean anything or, at least, are not particularly "actionable" when immediately discovered. If I'm being honest, I feel like we people analytics folks have collectively decided we're just a little too nuanced in our thinking to risk spending time on a data point that can be explained away almost as easily as we can calculate it. But then again, it's pretty easy to calculate. And, while it is a very macro number, you still ought to be attuned to how it's changing. I check the weather, not the climate, before going out camping for the weekend, but that doesn't mean it isn't important to know if the climate is changing. So put aside your nuanced expertise with me and let's do some basic calculations!
How do you calculate revenue per employee? The most simplistic view of this metric is to use this formula: average revenue per employee = revenue during the period / number of employees in that period. Ok, so let's say you get a number like $200,000. So what? It's not necessarily good or bad. There are happy shareholders out there whose companies have a lower revenue per employee and unhappy shareholders whose revenue per employee is much higher. So next you want to look at how that is trending over time, like so. (Source: One Model storyboard visualization for revenue per employee using a test database.) Trending is probably the best way to look at this data. Ideally, the number trends up, right? But it might not always. Perhaps you are growing and investing heavily in building out teams that will deliver revenue in the future. Perhaps your employee mix is changing: you're adding call center employees instead of research scientists. Perhaps, perhaps, perhaps. And so, even though we might be able to explain any changes away at this point, that's exactly the value of making the simple calculation in the first place. The resulting thought process leads directly into asking really fantastic questions about your people strategy. Are we investing in people now for future revenue later? Are we running significantly leaner than we have in the past? Maybe too lean? Pulling this data together into a simple graph took me 5 minutes and 24 seconds. Find out how long these types of insights take YOUR team to generate with the People Analytics Challenge.
Variations to Consider on Revenue per Employee by Industry. Thinking about direct vs. indirect contributors: ok, let's get back to our thoughts about call center employees vs. research scientists. Both are valuable, but you would expect an organization with more of the latter to have a higher revenue per employee. So in your own analysis, you can do a couple of things next. Segment by employee roles and do the same trend analysis. Again, the face value number might be meaningless. "We make $10,000,000 per accountant." The trend, however, IS interesting. "Say, we seem to need a lot more accountants per dollar of revenue than we used to. I wonder why that is?" You could also switch to a harder calculation like revenue per dollar of compensation (see Mike West's "Estimating Human Capital ROI," People Analytics For Dummies, p. 102). While you may not always be able to attribute revenue earned to specific departments, it will give you more insight if you monitor changes over time.
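If you want to run the trend yourself, here is a minimal sketch, assuming you can export quarterly revenue and end-of-period headcount into two columns. All of the figures below are made up for illustration.

```python
import pandas as pd

df = pd.DataFrame({
    "quarter": ["2021Q1", "2021Q2", "2021Q3", "2021Q4", "2022Q1"],
    "revenue": [48_000_000, 52_000_000, 51_000_000, 55_000_000, 50_000_000],
    "headcount_eop": [240, 255, 270, 290, 300],  # end-of-period employee count
})

# The basic formula: revenue during the period / employees in that period.
df["revenue_per_employee"] = df["revenue"] / df["headcount_eop"]

# Quarter-over-quarter change -- the trend matters more than any single value.
df["qoq_change_pct"] = df["revenue_per_employee"].pct_change() * 100
print(df[["quarter", "revenue_per_employee", "qoq_change_pct"]].round(1))
```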
If revenue per employee is dropping or rising over time, it could indicate that adjustments need to be made on a departmental level.
Switching to net vs. gross. If you do find some interesting insights as you trend out revenue per employee, you may want to check whether you are aligned with finance on the dollar amount that best aligns with the company's business strategy. If not, presenting this information to the C-level could be a mistake. You may find it relevant to look at net income, or the amount left after all expenses. Here are the various levels of income to think about:
Gross Revenue - Sales discounts = Net Revenue
Net Revenue - Cost of goods = Gross Profit
Gross Profit - Operating expenses = Operating Profit (EBIT)
Operating Profit - Loan interest = Profit before taxes (PBT)
Profit before taxes - Taxes = Net Income
If you're losing money, net income could help you determine which areas of the business may be contributing to that more than others. Again, these revenue-per metrics are more of an early warning system than a root cause analysis. Use them as a scanner to home in on more specific analyses. Once you get going, you can start to cut the data in more ways to see what stands out.
Grouping by seasonality. Measuring this metric by quarter or by month may give you an idea of how the business cycle evolves. However, be sure to compare similar quarters year over year (YoY), or months YoY, to get a better idea of whether things are improving or getting worse over time.
Considering the role of employee turnover. These last few years have been tough. As HR leaders, we know that bringing on new employees can directly impact the productivity of the company. If you have had a lot of turnover, you may want to align dates to see how this has impacted your revenue per employee ratio. Rather than celebrating a spike in revenue per employee, you might raise the alarm that you are stretched much thinner than before.
Grouping by tenure. If new employees cost more, then your most tenured employees must deliver the biggest bang for your buck, right? Breaking your data out by various cohorts will give you a better idea of how effectively you're building your talent. Do you want to see how One Model builds the insights you need in under 6 minutes? Request a Demo Today!
How do you find a revenue per employee benchmark for your industry? Another upshot of this relatively simple metric: you can often estimate it for publicly traded companies. The easiest way to do this is to choose at least 3 to 6 public companies in your space. Because they are publicly traded, finding revenue on the investor relations section of their site is much easier than trying to collect the same intel on private companies. In most cases, you can find that information over time as well. Once you have that, estimated employee counts can be found on many database websites, or even on LinkedIn. That said, estimating revenue per employee by industry for private companies can require a substantial amount of work. Look for investor information, and search for news articles on earnings.
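Here is a minimal sketch of that benchmarking step. The peer names, revenue figures, and employee counts are placeholders you would replace with numbers pulled by hand from investor relations pages.

```python
# Hypothetical benchmark: revenue and headcount collected by hand from
# public filings / investor relations pages. All figures are placeholders.
peers = {
    "Peer A": {"revenue": 1_200_000_000, "employees": 4_800},
    "Peer B": {"revenue": 950_000_000, "employees": 5_100},
    "Peer C": {"revenue": 2_300_000_000, "employees": 7_400},
}

for name, p in peers.items():
    print(f"{name}: ${p['revenue'] / p['employees']:,.0f} revenue per employee")

# With an odd number of peers, the middle value is a rough median benchmark.
values = sorted(p["revenue"] / p["employees"] for p in peers.values())
print(f"Peer median: ${values[len(values) // 2]:,.0f}")
```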
You Found Your Average Revenue per Employee and Benchmark - Now What? For starters, don't panic (or celebrate) if you notice big differences between your organization and others. A competitor may use a much larger number of contingent workers who don't appear in their employee counts, or they may have a very different support model. Again, this might feel like it devalues the comparison, because so much just depends. That said, doing the calculation suddenly gets you thinking things like, "Hey, has their people strategy changed significantly?" "Do they use more or less contingent staff?" These are great questions!
How Can You Change Average Revenue per Employee? Now that you're in a strategic mindset, start thinking about the levers you have at your disposal to adjust these numbers:
- Add additional hires, focusing on their ability to contribute to your top and bottom lines. [Fun game] Make a bet with yourself about how each hire will impact revenue per employee in the short and long term.
- See if your performance and engagement data trends align with the departmental trends you're seeing. Are high-performing teams trending up in revenue per employee? If so, how might you quantify the value and find new ways to invest in these areas?
- Create some buzz around the metric. Tell people that, of course, you know it's a macro data point, but get others thinking about how it's changing. Take a break from fighting over how to measure promotion rates and enjoy the landscape view.
- If paired with a high turnover rate, work to retain high-performing employees.
- Look at tools that may increase the efficiency of your workforce.
- Develop management to maximize performance in each of their departments.

    Read Article

    10 min read
    Phil Schrader

Post 1: Sniffing for Bull***t. As a people analytics professional, you are now expected to make decisions about whether to use various predictive models. This is a surprisingly difficult question with important consequences for your employees and job applicants. In fact, I started drafting up a lovely little three-section blog post around this topic before realizing that there was zero chance I was going to be able to pack everything into a single post. There are simply no hard and fast rules you can follow to know if a model is good enough to use "in the wild." There are too many considerations. To take an initial example, what are the consequences of being wrong? Are you predicting whether someone will click on an ad, or whether someone has cancer? In fact, even talking about model accuracy is multifaceted. Are you worried about detecting everyone who does have cancer-- even at the risk of false positives? Or are you more concerned about avoiding false positives? Side note: if you are a people analytics professional, you ought to become comfortable with the ideas of precision and recall. Many people have produced explanations of these terms, so we won't go into them here. Here is one from Towards Data Science. So, all that said, instead of a single, long post attempting to cover a respectable amount of this topic, we are going to put out a series of posts under the heading: Evaluating a Predictive Model: Good Smells and Bad Smells. And, since I've never met an analogy that I wasn't willing to beat to death, we'll use that smelly comparison to help you keep track of the level at which we are evaluating a model. For example, in this post we're going to start way out at bull***t range.
Sniffing for Bull***t. As this comparison implies, you ought to be able to smell these sorts of problems from pretty far out. In fact, for these initial checks, you don't even have to get close enough to sniff around at the details of the model. You're simply going to ask the producers of the model (vendor or in-house team) a few questions about how they work to see if they are offering you potential bull***t. Remember that predictions are not real. Because predictive models generate data points, it is tempting to treat them like facts. But they are not facts. They are educated guesses. If you are not committed to testing them and reviewing the methodology behind them, then you are contenting yourself with bull***t. Technically speaking, by bull***t, I mean a scenario in which you are not actually concerned with whether the predictions you are putting out are right or wrong. For those of you looking for a more detailed theory of bull***t, I direct you to Harry G. Frankfurt. At One Model we strive to avoid giving our customers bull***t (yay us!) by producing models with transparency and tractability in mind. By transparency, we mean that we are committed to showing you exactly how a model was produced, what type of algorithm it is, how it performs, how features were selected, and what other decisions were made to prepare and clean the data. By tractability, we mean that the data is traceable and easy to wrangle and analyze. When you put these concepts together, you end up with predictive models that you can trust with your career and the careers of your employees.
If, for example, you produce an attrition model, transparency and tractability will mean that you are able to educate your data consumers on how accurate the model is. It will mean that you have a process set up to review the results of predictions over time and see if they are correct. It will mean that if you are challenged about why a certain employee was categorized as a high attrition risk, you will be able to explain which features were important in that prediction. And so on. To take a counterexample, there's an awful lot of machine learning going on in the talent acquisition space. Lots of products out there are promising to save your recruiters time by using machine learning to estimate whether candidates are a relatively good or a relatively bad match for a job. This way, you can make life easier for your recruiters by taking a big pile of candidates and automagically identifying the ones that are the best fit. I suspect that many of these offerings are bull***t. And here are a few questions you can ask the vendors to see if you catch a whiff (or perhaps an overwhelming aroma) of bull***t. The same sorts of questions would apply in other scenarios, including models produced by an in-house team.
Hey, person offering me this model, do you test to see if these predictions are accurate? Initially I thought about making this question "How do you" rather than "Do you." I think "Do you" is more to the point. Any hesitation or awkwardness here is a really bad smell. In the talent acquisition example above, the vendor should at least be able to say, "Of course, we did an initial train-test split on the data and we monitor the results over time to see if people we say are good matches ultimately get hired." Now, later on we might devote a post in this series to self-fulfilling prophecies-- meaning, in this case, that you should be on alert for the fact that by promoting a candidate to the top of the resume stack, you are almost certainly going to increase the odds that they are hired and, thus, you-- or your model-- are shaping, rather than predicting, the future. But we're still out at bull***t range, so let's leave that aside. And so, having established that the producer of the model does in fact test their model for accuracy, the next logical question to ask is:
So how good is this model? Remember that we are still sniffing for bull***t. The purpose of this question is not so much to hear whether a given model has 0.75 or 0.83 precision or recall, but just to test whether the producers of the model are willing to talk about model performance with you. Perhaps they assured you at a high level that the model is really great and they test it all the time-- but if they don't have any method of explaining model performance ready for you… well… then their model might be bull***t.
What features are important in the model? / What type of algorithm is behind these predictions? These follow-up questions are fun in the case of vendors. Oftentimes vendors want to talk up their machine learning capabilities with a sort of "secret sauce" argument. They don't want to tell you how it works or the details behind it because it's proprietary. And it's proprietary because it's AMAZING. But I would argue that this need not be the case and that their hesitation is another sign of bull***t. For example, I have a general understanding of how the original PageRank algorithm behind Google Search works: crawl the web and work out the number of pages that link to a given page as a sign of relevance.
If those backlinks come from sites which themselves have large numbers of links, then they are worth more. In fact, Sergey Brin and Larry Page published a paper about it. This level of general explanation did not prevent Google from dominating the world of search. In other words, a lack of willingness to be transparent is a strong sign of bull***t.
How do you re-examine your models? Having poked a bit at transparency, these last questions get into issues of tractability. You want to hear about the capabilities that the producers of the model have to re-examine the work they have done. Did they build a model a few years ago and now they just keep using it? Or do they make a habit of going back and testing other potential models? Do they save off all their work so that they could easily return to the exact dataset that was used to train a specific version of the model? Are they set up to iterate, or are they simply offering a one-size-fits-all algorithm to you? Good smells here will be discussions about model deployment, maintenance, and archiving. Streets-and-sewers type stuff, as one of my analytics mentors likes to say. Bad smells will be vague, high-level assurances or-- my favorite-- simple appeals to how amazingly bright the team working on it is. If they do vaguely assure you that they are tuning things up "all the time," then you can hit them with this follow-up question:
Could you go back to a specific prediction you made a year ago and reproduce the exact data set and version of the algorithm behind it? This is a challenging question, and even a team fully committed to transparency and tractability will probably hedge their answers a bit. That's ok. The test here is not just about whether they can do it, but whether they are even thinking about this sort of thing. Ideally it opens up a discussion about how they will support you, as the analytics professional responsible for deploying their model, when you get challenged about a particular prediction. It's the type of question you need to ask now because it will likely be asked of you in the future. As we move forward in this blog series, we'll get into more nuanced situations-- for example, reviewing the features used in the predictions to see if they are diverse and make logical sense, or checking to see if the type of estimator (algorithm) chosen makes sense for the type of data you provided. But if the model that you are evaluating fails the bull***t smell test outlined here, then you're not going to have the transparency and tractability necessary to pick up on those more nuanced smells. So do yourself a favor and take a test whiff from a ways away before you stick your nose any closer.
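For readers who want to build intuition for the train-test split and precision/recall questions above, here is a minimal sketch using scikit-learn on synthetic data. It is a toy illustration, not anyone's production attrition model; in practice you would be asking your vendor or in-house team for exactly this kind of held-out evaluation.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an attrition data set: X = features, y = left/stayed.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.85], random_state=0)

# The train-test split is the bare-minimum honesty check:
# the model is scored on data it never saw during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
predictions = model.predict(X_test)

# Precision: of those we flagged as leavers, how many actually left?
# Recall: of those who actually left, how many did we flag?
print("precision:", precision_score(y_test, predictions))
print("recall:", recall_score(y_test, predictions))
```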

    Read Article

    10 min read
    Phil Schrader

Thanks for stopping by the blog to check out our work on integrating Workday, Greenhouse, and engagement survey data. Along with a video walking through the exact insights you can get, we use this blog to dive into key considerations when combining HCM, recruiting, and engagement surveys. If you want to chat through any of the ideas here, feel welcome to schedule a time on my calendar. I'd love to chat.
Why We're Even Talking About Workday Greenhouse Integrations with Survey Data. We started noticing a pattern about a year ago. Ryan and I would get a cool new lead from a really exciting company, often based on the West Coast, often in tech. During our initial conversation, they would talk about workforce growth, diversity, and engagement. Then we'd ask about their system mix, and they'd say, "Well, we switched to Workday a couple of years ago, but we use Greenhouse for recruiting, and we have Culture Amp for surveys (or Glint or Qualtrics)." Ryan and I started joking about how this was happening all the time-- to the point where we'd sometimes try to autocomplete "Culture Amp" for the person after they mentioned Greenhouse. (This totally failed on a recent call, so we'll stop doing that now.) Over the winter and into the spring, Ryan and I would periodically throw some time on the calendar to talk about this batch of companies we kept running into. We'd talk about the types of storyboards and views we might put together to focus specifically on them. Then the conversation would drift over into our mutual interests like land, soil, gardening, and regenerative agriculture.
Video: Insights from Greenhouse, Workday and Culture Amp. Eventually we were able to get some initial versions of these ideas built out in a demo One Model site-- and felt really excited that the inspiration we were finding out among the trees (Ryan in Vancouver) and fields (me in Texas) fit really well with the story we wanted to tell about how organizations grow over time. For me personally, it was just so satisfying to take the analytic side of my world and have it elevate, rather than reduce, the more organic, intangible, and relationship-oriented lessons I learn as a parent, a cook, and a gardener. (I also play tons of Call of Duty, so don't go feeling like you have to be some sort of woodland saint to appreciate this stuff.) In the video above we introduce some of these ideas for looking at your workforce, anchoring around the idea of treating hiring cohorts as organizational growth rings. In other words, starting with data from Workday (or whatever core HR system you use) and grouping headcount by the year they joined the company-- for example, everyone from what you might call "the hiring class of 2015." (There's a minimal sketch of this grouping below.)
Reviewing Your Growth Rings for Real Workday & Greenhouse BI. When you lay the data out like that, it's just flat-out interesting to look at. It gives you (or me, at least) a cool hybrid-style view. It makes me think of the way people invariably slow down and pause to appreciate the growth rings on a cross-cut section of a tree. On one hand, you get a definite feeling of growth and movement and activity. On the other, you get a sobering perspective on long time scales. You need this appreciation when thinking about how human beings cooperate and change as they do the work of your organization. This second feeling is a great counterweight to the action-oriented, get-it-done-now energy that we also must bring to our work.
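To make the growth-ring idea concrete, here is a minimal sketch, assuming an export of currently active employees with hire dates from Workday (or any core HR system). The column names and dates are just assumptions for illustration.

```python
import pandas as pd

# Hypothetical export: one row per currently active employee.
employees = pd.DataFrame({
    "employee_id": [1, 2, 3, 4, 5, 6],
    "hire_date": pd.to_datetime([
        "2015-03-02", "2015-09-14", "2018-06-01",
        "2020-01-20", "2020-11-09", "2022-04-04",
    ]),
})

# Each hire-year cohort is one "growth ring" -- e.g. the hiring class of 2015.
employees["growth_ring"] = employees["hire_date"].dt.year
rings = employees.groupby("growth_ring").size().rename("headcount")
print(rings)
```

From here you could join engagement scores or diversity attributes on employee_id and break each ring out further-- which is exactly the layering described in the rest of this post.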
As we looked at these growth rings, Ryan and I started to deepen our appreciation of how much human experience is represented in those layers-- how much somebody who has been around for 5 or 10 years has seen and learned, all the things about the organization that are usually intangible and difficult to measure. We thought that it was a humble and human perspective on what our analytic minds would call human capital, but what we could just call accumulated human experience. From the growth ring analogy, you can start to mix in other people analytics perspectives, like diversity. You can see that maybe your current headcount is trending in a more diverse direction, but you're going to see (and your newer hires might directly experience) a lagging effect where all that accumulated human experience takes longer to become more diverse. So much of it has already been accumulated in prior years. In fact, that gap might give you more appreciation for inclusion efforts in your workforce, because you can start to visualize the gap between a diverse headcount and an organization that has grown, developed, and incorporated a diversity of experience. And then we thought, "This would be the perfect place to layer in engagement data from Glint or Culture Amp or other surveys, because you could see both the engagement of your people and get that visual sense of the engagement of all that accumulated human experience." Ryan and I felt like that really boils a lot of people analytics down into something pretty simple. If someone comes into (or logs onto) work to start the day, and they've got 5 or 10 or more years of experience with your company's products, services, customers, culture, networks, systems, coworkers, etc., AND they're engaged and eager to dive back into that work-- well, then you can't really go wrong with that. What more could you ask for? You can't artificially assemble that. You've got to grow it. If you pull together some thinking on how a resilient ecosystem handles disruption and then think about what a wild, disruptive period we've been going through, you just get filled with this desire to grow a diverse, resilient workforce to match. And we also started seeing how the work that talent acquisition does can be informed and elevated by this view. Recruiting is often seen as the fast-paced (time to fill), process-driven (time in status) side of HR. But now we have a view that emphasizes the long-term consequences of all that frenetic activity, and a view that guides us in our analysis of that data.
Greenhouse and Survey Data Add Insight from the Beginning to the End of Your People's Journey. Greenhouse is both perfectly named and well designed for this type of thinking. Instead of leaving all that scorecard data (for example) behind at the point of hire, why not look back on past growth rings and ask: what did we learn from the interview process that might help us predict whether a certain candidate will really take root and become part of the deep tissue of our organization? Did we focus too much on the immediate skills they would bring, when it turns out that communication and adaptability were the things that really mattered? And so, what resulted from all these great conversations was the beginning of some new views on people data-- woven together from Workday, Greenhouse, and engagement surveys. We've captured this thinking in the video above. Please check it out if you haven't done so already.
As a final note, think of all the questions you could answer with a Workday and Greenhouse integration with survey data, like: Are our employees happy with their work-life balance? It took me less than an hour to bring the data together and build out some initial visuals. Are you asking all the right questions? Read about our People Analytics Challenge! Don't let our communication stop here! It's already been rewarding for me personally-- and I hope that there are many more conversations to come that grow these ideas further. If you've got some of those next ideas, or if you've got some questions about the views we put together-- grab some time to chat with me here:

    Read Article

    13 min read
    Phil Schrader

As the people analytics leader in your organization, you are responsible for transforming people data into a unique competitive advantage. You need to figure out what it is about your workforce that makes your business grow, and you need to get leaders across the organization on board with using data to make better employee outcomes happen. How will you do that in 2019? Will you waste another year attempting to methodically climb up the 4 or 5 stages of the traditional analytics maturity model? You know the one. It goes from operational reporting in the lower left, up through a few intermediate stages, and then, in the far distant upper right, culminates with predictive models and optimization. Here's the Bersin & Associates one for reference, or flip open your copy of Competing on Analytics (2007) for another (p. 8). The problem with this model is that on the surface it appears to be perfect common sense, while in reality it is hopelessly naive. It requires you to undertake the most far-reaching and logistically challenging efforts first. Then, in the magical future, you will have this perfect environment in which to figure out what is actually important. If this were an actual roadmap for an actual road, it would say, "Step 1: Begin constructing four-lane highway. … Step 4: Figure out where the highway should go." It is the exact opposite of the way we have learned to think about business in the last decade. Agile. The Lean Startup. Etc. In fact, it is such a perfect inverse of what you should be doing that we can literally turn this maturity model 180 degrees onto its head and discover an extremely compelling way to approach people analytics. Here is the new model. Notice the axes. This is a pragmatic view. We are now building impact (y axis) in the context of increasing logistical complexity (x axis). Impact grows as more people use data to achieve the people outcomes that matter. But as more and more people engage with the data, your logistical burden grows as well. These burdens will manifest themselves in the form of system integrations, data validation rules, metric definitions, and a desire for more frequent data refreshes. From this practical perspective, operational data no longer seems like a great place to start. It's desirable because it's the point at which many people in the organization will be engaging with data, but it will require an enormous logistical effort to support. This is a good time to dispense with the notion that operational data is somehow inferior to other forms of data-- that it's the place to start because it's so simplistic. Actually, your business runs operationally. Amazon's operational data, for example, instructs a picker in a warehouse to go and fetch a particular package from the shelves at a particular moment in time. That's just a row of operational data. But it occurs at the end of a sophisticated analytics process that often results in you getting a package on the very same day you ordered it. Operational data is data at the point of impact. Predictive data also looks quite different from this perspective. It's a wonderful starting point because it is very manageable logistically. And don't be put off by the fact that I've labeled its impact as lower. Remember that impact in this model is a function of the number of people using your data.
The impact of your initial predictive models will be felt in a relatively small circle of people around you, but that group of people will form your most critical allies as you seek to build your analytics program. For starters, it's your boss and the executive team. Sometime around Valentine's Day they will no doubt start to ask, "Hey, how's the roadmap coming along?" In the old model, you would have to say, "Oh, well, you know, it's difficult because it's HR data and we need to get it right first." Then you'd both nod knowingly and head off to LinkedIn to read more articles about HR winning a seat at the table. But this year you will say, "It's going great! We've run a few hundred predictive models and discovered that we can predict {insert Turnover, Promotion, Quality of Hire, etc.} with a decent degree of recall and precision. As a next step, we're figuring out how to organize this data more effectively so we can slice and dice it in more ways. After that we will start seeking out other data sets to improve our models and make a plan for distributing this data to our people leaders." Ah. Wouldn't that feel nice to say? Next, you begin taking steps to better organize your data and add new data sets. This takes more logistical effort, so you will engage your next group of allies: HR system owners and IT managers. Because they are not fools, they will be a little skeptical at first. Specifically, they're going to ask you what data you need and why it's worth going after. If you're operating under the old model, you won't really know. You might say, "All of it." They won't like that answer. Or maybe you'll be tempted to get some list of predefined KPIs from an article or book. That's safer, but you can't really build a uniquely differentiating capability for your organization that way. You're just copying what other people thought was important. If you adopt our upside-down model, on the other hand, you'll have a perfectly good answer for the system owners and IT folks. You'll say, "I've run a few hundred models and we know that this manageable list has the data elements that are the most valuable. These data points help us predict X. I'd like to focus on those." "Amen," they'll say. How's that for the first two months of 2019? You're showing progress to your execs. Your internal partners are on board. You are building momentum. The more allies you win, the more logistical complexity you can take on. At this stage people have reason to believe in you and share resources with you. As you move up the new maturity model with your IT allies, you'll start to build analytic data sets. Now you're looking for trends and exploring various slices. Now is the time for an executive dashboard or two. Now is the time to start demonstrating that your predictive models are actually predictive. These dashboards are focused. They're not a grab bag of KPIs. They might simply show the number of people last month who left the company and whether or not they were predicted by the model. Maybe you cut it by role and salary band. The point is not to see everything. The point is to see what matters. Your execs will gladly take three pieces of meaningful data once per month over a dozen cuts of overview data once a day. Remember to manage your logistical commitment. You need to get the data right about once a month. Not daily. Not "real time." Finally, you're ready to get your operational data right.
In the old world this meant something vague, like being able to measure everything and having all the data validated and other unrealistic things. In the new world it means delivering operational data at the point of impact. In the old world you'd say, "Hey HRBP or line manager, here are all these reports you can run for all this stuff." And they would either ignore them or find legitimate faults with them. In the new world, you say, "Hey HRBP or line manager, we've figured out how to predict X. We know that X is (good | bad) for your operations. We've rolled out some executive dashboards to track trends around X. Based on all that, we've invested in technology and process to get this data delivered to you as well." X can be many things. Maybe it's a list of entry-level employees likely to promote two steps based upon factors identified in the model. Maybe it's a list of key employees at high risk of termination based on the model. Maybe it's a ranking of employee shifts with a higher risk of a safety incident. Whatever it is for your business, you will be ready to roll it out far and wide because you've proven the value of data and you've pragmatically built a network of allies who believe in what you are doing. And the reason you'll be in that position is because you turned your tired old analytics maturity model on its head and acted the way an agile business leader is supposed to act. Yeah, but… Ok Phil, you say, that's a nice story, but it's impossible. We can't START with prediction. That's too advanced. Back when these maturity models were first developed, I'd say that was true. But the accessibility of data science has changed a lot in ten years. We are all more accustomed to talking about models and predictive results. More to the point, as the product evangelist at One Model, I can tell you with first-hand confidence that you can, in fact, start with prediction. One Model's One AI product offering ingests sets of data and runs them through a series of data processing steps, producing predictive models and diagnostic output. Here are the gory details on all that. Scroll past the image and I'll explain. Basically, there's a bunch of time-consuming work that data scientists have to do in order to generate a model. This may include things like taking a column and separating the data into multiple new columns (one-hot encoding), devising a strategy to deal with missing data elements, or checking for cheater columns (a column like "Severance Pay" might be really good at "predicting" terminations, for example). There are likely several ways to prepare a data set for modeling. After all that, a data scientist must choose from a range of predictive model types, each of which can be run with various different parameters in place. This all adds up to scrubbing, rescrubbing, running, and re-running things over and over again. If you are like me, you don't have the skill set to do all of that effectively. And you likely don't have a data scientist loitering around waiting to grind through all of that for you. That's why in the past this sort of thing was left at the end of the roadmap-- waiting for the worthy few. But I bet you are pretty good at piecing data sets together in Excel. I bet you've handled a vlookup or two on your way to becoming a people analytics manager. Well… all we actually need to do is manually construct a data set with a bunch of columns that you think might be relevant to predicting whatever outcome you are looking for. Then we feed the data into One AI.
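If you are curious what that manual assembly step might look like before the data goes in, here is a minimal sketch. The columns are invented, and the one-hot encoding and cheater-column check are just the flavor of preparation described above-- not One AI's actual pipeline.

```python
import pandas as pd

# Hypothetical hand-built data set: one row per employee, with a few
# columns you suspect relate to the outcome you want to predict.
data = pd.DataFrame({
    "department": ["Sales", "Engineering", "Sales", "Support"],
    "tenure_years": [1.5, 4.0, 0.8, 2.2],
    "severance_pay": [0, 0, 12000, 0],  # cheater column: set only AFTER a termination
    "terminated": [0, 0, 1, 0],
})

# Drop columns that leak the outcome -- they "predict" what already happened.
data = data.drop(columns=["severance_pay"])

# One-hot encoding: split one categorical column into multiple 0/1 columns.
data = pd.get_dummies(data, columns=["department"])
print(data)
```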
One AI cycles through all the gnarly stuff in the image above and gives you some detailed output on what it found. This includes an analysis of all the columns you fed in and also, of course, the model itself. You don't need to be able to do all the stuff in that image. You just need to be able to read and digest the results. And of course, we can help with that. Now, the initial model may not have great precision and recall. In other words, it might not be that predictive, but you'll discover a lot about the quality and power of your existing data. This exercise allows you to scout ahead, actually mapping out where your roadmap should go. If the initial data you got your hands on doesn't actually predict anything meaningful in terms of unique, differentiating employee outcomes-- then it's damn good you didn't discover that after three years of road building. That would be like one of those failed bridges to nowhere. Don't do that. Don't let the next phase of your career look like that. Welcome to 2019. We've dramatically lowered the cost of exploring the predictive value of your data through machine learning. Get your hands on some data. Feed it into One AI. If it's predictive, use those results to build your coalition. If the initial results are not overly predictive, scrape together some more data or try a new question. Iterate. Be agile. Be smart. Sometimes you have to stand on your head for a better view. How can I follow Phil's advice and get started?
About One Model: One Model provides a data management platform and a comprehensive suite of people analytics drawn directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as you need to. We provide a full platform for delivering more information, measurement, and accountability from your team.

    Read Article

    8 min read
    Phil Schrader

Last week I was doodling some recruiting graphs in my notebook, with an eye toward building out some new recruiting efficiency dashboards. I was thinking about how requisitions age over time, and I got an idea for a cool stacked graph that counts up how many requisitions you have open each month and breaks them out into age buckets. Maybe some supporting breakouts like recruiter, some summary metrics, etc. Something like this: Phil's Beautifully Hand-Illustrated Cholesterol Graph (above). This would be an awesome view. At a glance I could see whether my total req load was growing, and I could see if I'm starting to get a buildup of really old reqs clogging the system. This last part is why I was thinking of calling it the Requisition Cholesterol Graph. (That said, my teammate Josh says he hates that name. There is a comment option below… back me up here!) But then I got to thinking: how am I actually going to build that? What would the data look like? Think about it. Given: I have my list of requisitions, and I know the open date and close date for each of them.
Problem #1: I want to calculate the number of open reqs I have at the end of each time period. Time periods might be years, quarters, months, or days. So I need some logic to figure out whether each req is open during each of those time periods. If you're an Excel ninja, then you might start thinking about making a ton of columns and using some conditional formulas. Or… maybe you figure you can create some sort of pancake stack of rows by dragging a clever formula down the sheet… Also, if you are an Excel ninja… High five! Being an Excel ninja is cool! But this would be pretty insane to do in Excel. And it would be really manual. You'd probably wind up with a static report based on quarters or something, and the first person you show it to will ask if they can group it by months instead. #%^#!!! If you're a full-on business intelligence hotshot or Python/R wiz, then you might work out some tricky joins to inflate the data set so that each req gets a record (or a counted value in a script) for every period between its open date and its close date. Doable. But then…
Problem #2: Now you have your overall count of reqs open in each period. Alls you have to do now is group the requisitions by age and you're… oh… shoot. The age grouping of the requisitions changes as time goes on! For example, let's say you created a requisition on January 1, 2017. It's still open. You should count the requisition in your open req count for January 2017, and you'd also count it in your open req count for June 2018 (because it's still open). Figuring all that out was Problem #1. But now you want to group your requisitions by age ranges. So back in January 2017, the req would count in your 0-3 months old grouping. Now it's in your > 1 year grouping. The grouping changes dynamically over time. Ugh. This is another layer of logic to control for. Now you're going to have a very wild Excel sheet or even more clever scripting logic. Or you're just going to give up on the whole vision, calculate the average days open across all your reqs, and call it a day.
$Time_Context is on my side (gets a little technical). But I didn't have to give up. It turns out that all this dynamic grouping stuff just gets handled in the One Model data structure and query logic-- thanks to a wonderful little parameter called $Time_Context (and no doubt a lot of elegant supporting programming by the engineering team).
When I ran into $Time_Context while studying how we do org tenure, I got pretty excited and ran over to Josh and yelled, "Is this what I think it is!?" (via Slack). He confirmed that yes, it was what I hoped it was. I already knew that the data model could handle Problem #1 using some conditional logic around effective dates and end dates. When you run a query across multiple time periods in One Model, the system can consider a date range and automatically tally up accurate end-of-period (or start-of-period) counts based on those date ranges. If you have a requisition that was opened in January 2017 and you want to calculate the number of reqs you have open at the end of every month, One Model will cycle through the end of each month, check to see if the req was opened before then and is not yet closed, and add it to the totals. We use this for all sorts of stuff, particularly headcount calculations using effective dates and end dates. So Problem #1 was no problem, but I expected this. What I didn't expect, and what made me Slack for joy, was how easily I could also deal with Problem #2. It turns out I could build a data model and stick $Time_Context in the join to my age dimension. Then One Model would just handle the rest for me. If you've gotten involved in the database side of analytics before, then you're probably acquainted with terms like fact and dimension tables. If you haven't, just think vlookups in Excel. So, rather than doing a typical join or vlookup, One Model allows you to insert a time context parameter into the join. This basically means, "Hey One Model, when you calculate which age bucket to put this req in, imagine yourself back in time in whatever time context you are adding up at that moment. If you're doing the math for January 2017, then figure out how old the req was back then, not how old it is now. When you get to February 2017, do the same thing." And thus, Problem #2 becomes no problem. As the query goes along counting up your metric by time period, it looks up the relevant requisition age grouping and pulls in the correct value as of that particular moment in time. So, with our example above, it goes along and says, "Ok, I'm imagining that it's January 2017. I'll count this requisition as being open in this period of time, and I'll group it under the 0-3 month old range." Later it gets to June 2018 and says, "Ok… dang, that req is STILL open. I'll include it in the counts for this month again and, let's see… ok, it's now over a year old." This, my friends, is what computers are for! We use this trick all the time, particularly for organization and position tenure calculations.
TL;DR: In short, One Model can make the graph I was dreaming of-- no problem. It just handles all the time complexity for me. Here's the result in all its majestic, stacked-column glory: So now, at a glance, I can tell if my overall requisition load is increasing. And I can see down at the bottom that I'm starting to develop some gunky buildup of old requisitions (orange). If I wanted to, I could also adjust the colors to make the bottom tiers an ugly, gunky brown like in the posters in your doctor's office. Hmmm… maybe Josh has a point about the name... And because One Model can handle queries like this on the fly, I can explore these results in more detail without having to rework the data. I can filter or break the data out to see which recruiters or departments have the worst recruiting cholesterol. I can drill in and see which particular reqs are stuck in the system.
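For the curious, here is a minimal sketch of the same "as of each period" counting and dynamic age bucketing in plain Python/pandas-- not how One Model implements $Time_Context, just the logic it handles for you. The requisition dates and snapshot dates are invented.

```python
import pandas as pd

reqs = pd.DataFrame({
    "req_id": [1, 2, 3],
    "opened": pd.to_datetime(["2017-01-01", "2018-03-15", "2018-05-01"]),
    "closed": pd.to_datetime([None, "2018-06-30", None]),  # NaT = still open
})

def age_bucket(days_open: int) -> str:
    if days_open <= 90:
        return "0-3 months"
    if days_open <= 365:
        return "3-12 months"
    return "> 1 year"

# For each month-end snapshot, count the reqs open AS OF that date and
# bucket them by how old they were AT that date -- the same req moves
# through buckets as the periods roll forward.
for period_end in pd.to_datetime(
        ["2017-01-31", "2017-12-31", "2018-03-31", "2018-06-29"]):
    open_now = reqs[(reqs["opened"] <= period_end) &
                    (reqs["closed"].isna() | (reqs["closed"] > period_end))]
    buckets = open_now.apply(
        lambda r: age_bucket((period_end - r["opened"]).days), axis=1)
    print(period_end.date(), dict(buckets.value_counts()))
```

Run it and you can watch the January 2017 req start in the "0-3 months" bucket and end up in "> 1 year"-- exactly the dynamic regrouping described above.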
And, if you hung on for this whole read, then you are awesome too. Kick back and enjoy some Rolling Stones: https://www.youtube.com/watch?v=wbMWdIjArg0.

    Read Article

    9 min read
    Phil Schrader

We're back with another installment of our One Model Difference series. On the heels of our One AI announcement, how could we not take this opportunity to highlight it as a One Model difference maker? In preparation for the One AI launch, I caught up with Taylor from our data science team and got an updated tour of how it all works. I'm going to try to do that justice here. The best analogy I can think of is that this thing is like a steam engine for data science. It takes many tedious, manual steps and lets the machine do the work instead. It's not wizardry. It's not a black-box system where you have to point at the results, shrug, and say, "It's magic." This transparent approach is a difference in its own right, and I'll cover that in a future installment. For now, though, describing it as some form of data wizardry simply would not do it justice. I think it's more exciting to see it as a giant, ambitious piece of industrial data machinery. Let me explain. You know the story of John Henry, right? John Henry is an African-American folk hero who, according to legend, challenged a steam-powered hammer in a race to drill holes for a railroad tunnel. It's a romantic, heart-breaking story. Literally. It ends with John Henry's heart exploding from the effort of trying to keep pace. If you need a quick refresher, Bruce Springsteen can fill you in here. (Pause while you use this excuse to listen to an amazing Bruce Springsteen song at work.) Data science is quite a bit easier than swinging a 30-pound hammer all day, but I think the comparison is worthwhile. Quite simply, you will not be able to keep pace with One AI. Your heart won't explode, but you'll be buried under an exponentially growing number of possibilities to try out. This is particularly true with people data. The best answer is hiding somewhere in a giant space defined by the data you feed into the model, multiplied by the number of techniques you might try out, multiplied by (this is the sneaky one) the number of different ways you might prepare your data. Oh, and that's just to predict one target. There are lots of targets you might want to predict in HR! So you wind up with something like tedious work to the fourth power, and you simply should not do it all by hand.
All data science is tedious. The first factor, deciding what data to feed in, is something we're all familiar with from stats class. Maybe you've been assigned a regression problem and you need to figure out which factors to include. You know that a smaller number of factors will probably lead to a more robust model, and you need to tinker with them to get the ones that give you the most bang for your buck. This is a pretty well-known problem, and most statistical software will help you with this. This phase might be a little extra tricky to manage over time in your people analytics program, because you'll likely bring in new data sets and have to retest the new combinations of factors. Still, this is doable. Hammer away. Of course, One AI will also cycle through all your dimensional data for you. Automatically. And if you add factors to the data set, it will consider those factors too. But what if you didn't already know what technique to use? Maybe you are trying to predict which employees will leave the company. This is a classification problem. Data science is a rapidly evolving field, and there are LOTS of ways to try to classify things. Maybe you decide to try a random forest. Maybe you decide to try neural nets using TensorFlow.
Now you’re going to start to lose ground fast. For each technique you want to try out, you’ve got to cycle through all the different data you might select for that model and evaluate the performance. And you might start cycling through different time frames. Does this model predict attrition well using one year of data but become less accurate with two years…? And so on. Meanwhile, One AI will automatically test different types of models and techniques, over different time periods, while trying out different combinations of variables and evaluating the outcomes. You’ll fall behind rapidly. But there’s more...

Now things get kind of meta. HR data can be really problematic for data science. There is a bunch of manual work you need to do to prepare any data set to yield results. This is the standard stuff, like weeding out bad columns, weeding out biased predictors, and trying to reduce the dimensionality of your variables. But this is HR DATA. The data sets are tiny and lopsided even after you clean them up. So you might have to start tinkering with them to get them into a form that will work well with techniques like random forests, neural nets, etc. If you’re savvy, you might try adaptive synthetic sampling (synthesizing extra examples so the smaller, under-represented class carries more weight) or principal component analysis. (I’m not savvy, I’m just typing what Taylor said.)

So now you’re cycling through different ways of preparing the data, to feed into different types of models, to test out different combinations of predictors. You’ve got tedious work to the third power now. Meanwhile, One AI systematically hunts through these possibilities as well. Synthetic sampling was a dead end? No problem. On to the next technique, and on through all the combinations to test that follow. This is not brute force per se; that would actually introduce new problems around overfitting. The model generation and testing can be organized to explore the problem space in an intelligent way. But from a human vs. machine perspective, yeah, this thing has more horsepower than you do. And it will keep working the models over, month after month. This is steam-powered data science. Not magic. Just mechanical beauty.
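To make that third power concrete, here is a toy version of the search: data preparations, crossed with model types, crossed with feature subsets. It's a sketch built on scikit-learn and imbalanced-learn (which provides ADASYN) over the same stand-in data as above; real HR extracts, more techniques, more time windows, and more targets blow the combination count up fast.

# Preparations x models x feature subsets: 3 x 2 x 2 = 12 fits already.
from itertools import product

from imblearn.over_sampling import ADASYN
from imblearn.pipeline import Pipeline  # sampler-aware pipeline
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=2000, n_features=12,
                           weights=[0.95], random_state=0)

preparations = {
    "raw": [],
    "adasyn": [("sample", ADASYN(random_state=0))],
    "pca": [("scale", StandardScaler()), ("pca", PCA(n_components=6))],
}
models = {
    "forest": RandomForestClassifier(random_state=0),
    "logit": LogisticRegression(max_iter=1000),
}
feature_subsets = {"all_cols": slice(None), "first_half": slice(0, 6)}

for (p, steps), (m, clf), (f, cols) in product(
        preparations.items(), models.items(), feature_subsets.items()):
    pipe = Pipeline(steps + [("clf", clf)])
    auc = cross_val_score(pipe, X[:, cols], y, scoring="roc_auc", cv=3).mean()
    print(f"{p:>7} + {m:>6} + {f:>10}: AUC {auc:.3f}")

Twelve fits is nothing; the point is the multiplication. Each new preparation, model family, feature subset, time window, or target multiplies the whole grid again.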
And now that we have this machine for HR machine learning, we can point that three-phase cycle at different outcomes we want to predict. Want to predict terminations? Of course you do. That’s what everyone wants to predict. But what if in the future you want to predict quality of hire based upon a set of pre-hire characteristics? One AI will hunt through different ways to stage that data, through different predictive techniques for each of those potential data sets, and through different combinations of predictors to feed into each of those models… and so on and so on. You can’t replicate this with human-powered data science alone. And you shouldn’t want to. There’s no reason to try to prove a John Henry point here. Rather than tediously cycling through models, your data science team can think about new data to feed into the machine, help interpret the results and how they might be applied, or devise their own wild one-off models to try, because they won’t have to worry about exhaustively searching through every other option. This might turn out similar to the human-computer partnership in chess (https://www.bloomreach.com/en/blog/2014/12/centaur-chess-brings-best-humans-machines.html). One AI certainly supports this blended, cooperative approach.

Each part of the prediction pipeline can be separated and used on its own. Depending on where you are in your own data science program, you might take advantage of different One AI components. If you just want your data cleaned, we can give you that. Or, if you already have the data set up the way you want it, we can save you time by running a set of state-of-the-art classifiers on it. The goal is to have the cleaning/preprocessing/upsampling/training/etc. pieces all broken out so you can use them individually or in concert. In this way, One AI can deliver value whatever the size and complexity of your data science team, as opposed to an all-or-nothing scenario.
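In code, that a-la-carte idea looks something like the sketch below, again using scikit-learn and imbalanced-learn as stand-ins for separable stages. It illustrates the concept only; it is not One AI's actual interface.

# Use one stage alone, or chain stages together as you need them.
import numpy as np
from imblearn.over_sampling import ADASYN
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=2000, n_features=12,
                           weights=[0.95], random_state=0)

# Just want your lopsided data rebalanced? Run the sampling stage alone.
X_bal, y_bal = ADASYN(random_state=0).fit_resample(X, y)
print("class counts after resampling:", np.bincount(y_bal))

# Data already staged the way you want it? Skip straight to training.
clf = GradientBoostingClassifier(random_state=0).fit(X_bal, y_bal)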
In that regard, our human vs. machine comparison starts to break down. One AI is here to work with you. Imagine what John Henry could have done if they’d just given him the keys to the steam engine.

Book some time on Phil's calendar below to get your HR data-related questions answered.

About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own. Our newest tool, One AI, integrates cutting-edge machine learning capabilities into the One Model platform, equipping HR professionals with readily accessible, unparalleled insights from their people analytics data. Notable customers include Squarespace, PureStorage, HomeAway, and Sleep Number.

    6 min read
    Phil Schrader

There will be over 400 HR product and service providers in the expo hall at HR Tech in September. A typical company makes use of 8 to 11 of these tools, and some use as many as 30. And that is wonderful. I love working in HR technology. Companies are increasingly free to mix and match different solutions to deliver the employee experience that is right for them. New products come to market all the time, and the entrepreneurs behind these products are pretty consistently driven by a desire to make work better for employees. Better for the employees who don't work in HR Operations and People Analytics, that is. Because all that innovation leads to data fragmentation.

In your organization, you might recruit candidates using SmartRecruiters in some countries and iCIMS in others. You might do candidate assessments in Criteria Corp and Weirdly. Those candidates might get hired into Workday, have their performance reviews in Reflektive, and share their own feedback through Glint surveys. This would not be in the least bit surprising. And it also wouldn't be surprising if your internal systems landscape changed significantly within the next 12 months. The pace of innovation in this space is not slowing down, and the all-in-one suite vendors can’t keep pace with 400 best-of-breed tools. So if you want to adopt new technology and benefit from all this innovation, you will have to deal with data fragmentation. How do you adopt new innovation without losing your history? What if the new technology isn’t a fit? Can you try something else without leaving a gaping hole in your analytics and reporting? How will you align your data to figure out if the system is even working?

This is where One Model fits into the mix. We're going to call this One Model Difference your Data Insurance Policy. One Model pulls together all the data from your HR systems and related tools, then organizes and connects this data as if it all came from a single source. This means you can transition between technology products without losing your data. This empowers you to choose whichever technology fits your business without suffering a data or transition penalty.

I remember chatting about this with Chris back at HR Tech last year. At the time I was working at SmartRecruiters, and I remember thinking... Here we are, all these vendors making our pitches and talking about all the great results you're going to get if you go with our product. And here's Chris, literally standing in the middle of it all with One Model. And if you sign up with One Model, you'll be able to validate all these results for yourself, because you can look across systems. For example, you could look at your time to hire for the last 5 years and see if it changed after you implemented a new ATS. If you switched out your HRIS, you could still look backwards in time from the new system to the old and get a single view of your HR performance. You could line up results from different survey vendors. You'd literally have "one model," and your choice of technology on top of that would be optional. That's a powerful thought.
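To picture what "as if it all came from a single source" means in practice, here is a toy-scale sketch in pandas: two ATS exports with different schemas, aligned into one time-to-hire dataset. The SmartRecruiters and iCIMS field names below are invented for the example; this illustrates the idea, not One Model's pipeline.

# Two hypothetical ATS exports with different column names.
import pandas as pd

smart = pd.DataFrame({"appliedOn": ["2023-01-05"], "hireDate": ["2023-02-10"]})
icims = pd.DataFrame({"ApplicationDate": ["2023-01-12"], "StartDate": ["2023-03-01"]})

# Map both schemas onto one shared model of a "hire".
unified = pd.concat([
    smart.rename(columns={"appliedOn": "applied", "hireDate": "hired"})
         .assign(source_system="SmartRecruiters"),
    icims.rename(columns={"ApplicationDate": "applied", "StartDate": "hired"})
         .assign(source_system="iCIMS"),
], ignore_index=True)

# One metric, one definition, across systems; it survives a vendor swap.
unified["time_to_hire_days"] = (
    pd.to_datetime(unified["hired"]) - pd.to_datetime(unified["applied"])
).dt.days
print(unified[["source_system", "time_to_hire_days"]])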
A few months later, here I am getting settled in at One Model. I'm getting behind the scenes, seeing how all this really comes together. And yeah, it looks just as good from the inside as it did from the outside. I've known Chris for a while, so it's not like I was worried he was BS-ing me. But given all the new vendors competing for your attention, you'd be nuts not to be a little skeptical about claims like data-insurance-policy-that-makes-it-so-you-can-transition-between-products-without-losing-your-data. So here are a couple of practical reasons to believe, beyond the whole cleaning-up-and-aligning-your-data stuff we covered previously.

First off, One Model is... are you ready... single tenant. Your data lives in its own separate database from everyone else's data. It's your data. If you want direct database access to the data warehouse we've built for you, you can have it. Heck, if you want to host One Model in your own instance of AWS, you can do that. We're not taking your data and sticking it into some rigid multi-tenant setup at arm's length from you. That would not be data insurance. That would be data hostage-taking.

Second, One Model doesn't charge per data source. That would be like one of those insurance policies where everything is out-of-network. With One Model, your systems are in-network. If you add a new system and you want the data in One Model, we'll add the data to One Model. If we don't have a connector, we'll build one. One of our clients has data from 40 systems in One Model. 40 systems. In one single model. In its own database. With no fees per data source.

So go wild at HR Tech this fall. It is in Vegas, after all. Add all the solutions that are right for your employees. And tell all your new vendors you'll be able to hold them accountable for all those bold ROI-supporting metrics they're claiming. Because you can put all your data into One Model for all your people analytics. You can see for yourself. And if you swap that vendor out later, you'll take all your data with you. Just don't wait until then to reach out to us at One Model. We love talking shop. And if you happen to like what you see with One Model, we can have your data loaded well before you get to Vegas.

About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as you need to. We provide a full platform for delivering more information, measurement, and accountability from your team.


    7 min read
    Phil Schrader

People often ask us, "What makes One Model different?" Well... there's a lot we could show and tell. We've decided to respond with a series of blog posts covering each and every reason. Read on for more!

You can't do this with Tableau. One Model features the most advanced role-based security system of any analytics application. It has to. People data is often the most complex and most sensitive data in an organization. Through 15 years of experience working with people analytics teams and knowing how they wish to provide access, we built a security methodology that caters for all scenarios and fills the complex gaps that other vendors ignore.

One Model allows administrators to define custom security groups and designate fine-grained application permissions to each: Can these users create dashboards or just view them? Can they even filter dashboards? Can they drill down to detail? Can they use the Explore tool? Can they build their own metrics? Can they access a metric in one population but not in another without changing roles?

At the data layer, One Model group permissions control access at both the column level (which data elements a user can see) and the row level (which records of the data element a user can access): Can they see a given department’s data? Do they have access to compensation metrics? Can they cut those metrics by Grade or Position? If they drill into that data, can they see person-level detail?

Better still, One Model security roles understand each user’s relationship to the data: who reports to whom, for example. That means you could grant all the leaders in your organization drill-down access to their own team members with a single contextual data rule that follows them through the organization as their data changes. Done. Zero maintenance required. Multiple roles merge together to provide the exact level of access for each user, regardless of whether they're an HRBP, an executive, or a director with complex reporting lines. This is not something you can achieve with Tableau, Qlik, or any other vendor in our space. They come close, but they don't understand the relationship between a user and the data itself, which results in constant role-security maintenance, if the desired access can be achieved at all.
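Here is a minimal sketch of those three ideas: merged role grants, a column-level deny driven by local law, and a contextual row-level rule that walks the current org chart. All of the role names, columns, and reporting data below are invented for illustration; this is not One Model's implementation.

# Column grants per role; a user's effective access merges all their roles.
ROLE_COLUMNS = {
    "hrbp": {"headcount", "turnover", "performance_rating", "gender", "age"},
    "recruiting_manager": {"headcount", "assessment_score"},
}

# Local-law restriction: deny these columns for records in this country.
COUNTRY_DENIES = {"DE": {"performance_rating", "gender", "age"}}

# Current manager of each employee, refreshed from the HRIS, so the
# contextual rule needs zero maintenance as the org changes.
MANAGER_OF = {"bob": "alice", "carol": "alice", "dave": "carol"}

def effective_columns(roles, record_country):
    """Merge every role's grants, then apply any country-level deny."""
    granted = set().union(*(ROLE_COLUMNS[r] for r in roles))
    return granted - COUNTRY_DENIES.get(record_country, set())

def in_reporting_chain(viewer, employee):
    """Row-level rule: may the viewer drill into this employee's records?"""
    while employee is not None:
        if employee == viewer:
            return True
        employee = MANAGER_OF.get(employee)  # walk up the org chart
    return False

# An HRBP looking at a record in Germany sees only the permitted columns...
print(effective_columns({"hrbp"}, "DE"))    # {'headcount', 'turnover'}
# ...and alice can drill into dave's detail because dave rolls up to her.
print(in_reporting_chain("alice", "dave"))  # True: dave -> carol -> alice

The contextual rule is the interesting part: because it is evaluated against current manager data, a reorg changes who can see what without anyone editing a role.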
Why it matters: self-service is part of most teams' People Analytics roadmap, if not the goal itself. If you want to deliver self-service with HR data, you’ll need to effectively and sustainably manage fine-grained sets of permissions like the ones described above. Here’s a look at what is possible with the right role-based security capabilities.

Let’s say that you’ve developed an integrated recruiting effectiveness dashboard. Your business leaders, recruiting managers, and HRBPs all have access to this dashboard. Based on aggregate data, your business leader can see that the new candidate assessment is, in fact, doing a great job predicting high-performing new hires across the company. She drills into her own team’s details and scans through a few examples. This builds her confidence both in the assessment tool and in the dashboard. She’s likely to come back and use other dashboards in the future.

The recruiting manager, looking at the same dashboard, is excited by the overall results but wants to see if this assessment result is having a negative impact on protected groups of candidates in the hiring process. Given her role, she has access to aggregate slices of demographic data. She uses dashboard filters to cut the data by gender, age, and ethnicity without having to request a one-off ad-hoc report. She’s ready when the topic comes up in a meeting later that day. She thanks you the next time you see her.

The division’s HRBP has similar ideas, but her security clearance is more complex because her division is split across countries. Due to local laws in one country, she's not allowed to view performance ratings or conduct age and gender analyses, so those views are seamlessly unavailable for that population. Within those limits, she takes things a step further in the One Model Explore tool and analyses a recent change to recruiting practices. She combines assessment results and termination data along with her most recent employee survey results. The results are so interesting that she reaches out to you. “Hey, my termination rates are down. We think we’re making better hires based on this new assessment tool, and employee satisfaction is up as well. These are all good signs, but can you figure out which results are driving the others?” After a cursory analysis, the next step is to prove there is a correlation and quantify its impact with the built-in One AI machine learning suite. Awesome. Isn’t this scenario why your company funded the program in the first place?

Without advanced role-based permissions? Well, you probably know that story already. It starts with a generic, one-size-fits-all dashboard. The plot thickens with the arrival of ad hoc reporting requests and vlookups. And the story ends with… well… more ad hoc reporting and vlookups.

If this is something that excites you, let's talk. Click the button below to schedule some time with a One Model team member. We can answer any specific questions you may have, or just chat about role-based permissions (if that's what you're into). About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as you need to. We provide a full platform for delivering more information, measurement, and accountability from your team.
