Featured
4 min read
Steve Hall
When it comes to People Analytics, the most valuable tool is one that lets you ask the right questions and explore solutions. Canned insights can't answer the real questions you need answered. Recently, during a demo with a prospective client, a question came up that perfectly illustrates how One Model is a platform built for problem-solving rather than just offering canned insights.

The Situation: A Forecasting Challenge

The scenario began with a focus on Female Representation metrics, specifically forecasting whether the organization was on track to meet its diversity targets for women. The forecast feature showed trends for different job levels, and while representation looked promising for some levels, there was a noticeable downward trend at the executive level. Naturally, the prospect wanted to know: Why is this happening? This was not a question with an easy, pre-packaged answer. Instead, it required a deeper dive into the data, an approach that highlights One Model's value as a tool for discovery and insight generation.

Digging Deeper: How We Tackled the Problem

To address the question, we demonstrated how to use filters and visualizations to isolate and explore the data. Here's how it unfolded:

Applying Filters: We filtered the data by job level and gender to focus specifically on female executives. From there, we looked at key metrics like net hiring trends and termination rates.

Identifying Patterns: The data revealed a significant drop in representation between 2023 and 2024, which appeared dramatic due to the auto-scaling of the graph.

Exploring Causes: By clicking through different visualizations, we identified that termination rates, particularly "other" terminations, were higher than expected. Using One Model's hotspot maps, we further pinpointed the specific business unit and region where the issue was most acute.
Forming Hypotheses: Using this information, we leveraged One Model's built-in predictive AI capabilities to identify potential turnover drivers and develop actionable insights.

Flexibility Matters

This scenario underscores something critical about One Model: we don't solve all your problems; we give you the tools to solve them. Other platforms that rely on rigid, canned use cases might struggle in this situation; no solution can offer pre-built analyses for every possible scenario. Without a pre-built guide addressing their specific issue in their specific organization, the user hits a wall. One Model, by contrast, enables users to dynamically filter, explore, and analyze data to uncover answers.

Why This is Critical for People Analytics

This scenario demonstrates the real-world challenges of People Analytics. Insights are rarely handed to you on a silver platter. Instead, they require a combination of curiosity, exploration, and judgment, qualities not even AI will bring to the table. While some HRBP-level professionals might not engage in this level of analysis, advanced People Analytics practitioners understand that solving complex, niche problems, like representation trends at a specific level, requires more than surface-level data.

The One Model Advantage

Here's why One Model is different:

Speed: Because One Model creates a unified single source of truth for your organization, you can explore complex interactions without manually manipulating data, saving you time.

Flexibility: You're not limited to prebuilt Storyboards or canned content. You can adapt and dig into unique questions in real time, even in situations where you need to create new metrics to explore an issue.

Depth of Insights: By enabling dynamic exploration, One Model allows for nuanced and complete answers that out-of-the-box solutions can't deliver.

The takeaway from this use case is simple: good insights require effort.
Platforms that promise quick, prebuilt solutions often oversimplify problems or deliver incomplete answers. One Model’s strength lies in empowering users to dig deeper and uncover real insights—even when the questions are complex. With One Model, you’re not just using a People Analytics platform—you’re solving real problems.
3 min read
Kelley Kirkpatrick
Workplace gender equality is a critical focus for Australian employers, supported by initiatives like the Workplace Gender Equality Agency (WGEA). Established to promote and improve gender equality in workplaces, WGEA regulations require organisations with 100 or more employees to report gender data across six gender equality indicators. These indicators cover key aspects such as pay equity, workforce composition, and representation in management roles. The reporting process is detailed, requiring a combination of point-in-time employee data and aggregate metrics spanning the reporting year. While this mandate enables organisations to reflect on and address gender equality, the process itself is challenging. Teams often spend weeks, or even months, sourcing data from disparate systems, aligning it with WGEA's strict criteria, and meticulously validating it. With the right tool, however, this complex task can be streamlined.

Tackling the Challenges of WGEA Reporting

For most organisations, WGEA reporting is not just about compliance; it's about leveraging the reported data to foster deeper insights into workforce gender equality. Yet the process is notoriously cumbersome. Compiling detailed employee information, annualised salary data, and metrics such as leave and movement patterns often requires manual intervention and significant cross-team collaboration. This can lead to inefficiencies, inaccuracies, and limited time to analyse the results. But what if the data collection and validation process could be automated? This is where One Model comes in.

Automating WGEA Reporting: How One Model Makes it Simple

The One Model People Analytics platform transforms the arduous WGEA reporting process into a streamlined, automated operation.
By centralising workforce data, aligning it with WGEA's submission templates, and enabling robust validation, One Model allows organisations to achieve compliance efficiently while focusing on what matters most: understanding and acting on the insights derived from the data.

Customer Spotlight

An Australian customer recognised the platform's potential to simplify their WGEA reporting. Some of the required workforce data was already housed in One Model, but integrating the full dataset, including annualised salary details and movement metrics, was the next step. By ingesting and modelling all the necessary data, One Model provided the customer with:

A single source of truth: Data was centralised, validated, and securely accessible to analysts across various teams.

Streamlined workflows: Automation reduced the need for manual data manipulation and cross-referencing.

Tailored insights: The customer leveraged One Model's analytics capabilities to create Storyboards and executive reports, turning raw data into actionable insights.

The outcome? The customer not only completed their WGEA submission in one day instead of five, but also unearthed valuable insights for their leadership team, who were impressed by the quality and depth of the analysis.

Beyond Compliance: The True Benefits of Automating WGEA Reporting

Automating WGEA reporting increases efficiency and confidence while shining a light on what's going well and where there is room for improvement.

Efficiency: Teams save weeks of effort through automation, ensuring submissions are accurate and on time.

Data Integrity: By centralising data and applying consistent validation, organisations can trust their numbers.

Insights-Driven Culture: Once the reporting is complete, the data can be repurposed to drive conversations around workforce planning, pay equity, and diversity initiatives.

With One Model, you can transform a complex compliance process into a streamlined, automated workflow.
This not only saves time but also provides valuable insights that drive meaningful progress in workplace gender equality across the country. Ready to simplify your WGEA Reporting?
14 min read
Phil Schrader
As a people analytics leader, you're going to be confronted with some not-so-simple, horribly open-ended questions: "Hey, so what do you want to measure?" "Where should we start?" or… "What HR dashboards should we build?" Perhaps these words have been uttered by a well-intentioned business analyst from IT, peering at you from behind a laptop, eager to get your items added to an upcoming sprint before all their resources get tied up with something else. What do you say? Something really gnarly and fancy that shows your analytic savvy? Something that focuses on a key issue confronting the organization? Something basic? Fear not. In this blog post, we'll walk you through eight essential people analytics dashboards. You should be able to get the HR data for all of these from your core HRIS or HCM solution, even if the data lives in different modules and you have to combine it into one dataset. The key performance indicators (KPIs) in these views will give you the highest impact:

1. Headcount Metrics Dashboard
2. Span of Control Dashboard
3. Employee Turnover Dashboard or Attrition Dashboard
4. Talent Flow Dashboard
5. Career Growth / Promotions Dashboard
6. Diversity (DE&I) Dashboard
7. Employee Tenure Dashboard
8. Dashboard Definitions and Details

1: Headcount Metrics Dashboard

Headcount metrics are the foundation of people analytics. Headcount speaks volumes. Trend it over time, break it out by key groupings, and you are well on your way to doing great people analytics. Here's an initial view that captures the basics, and here's what's included in this dashboard so you can get a handle on headcount. In the upper right, you've got what I call the "walking around number". It's not anything that will help you make an informed decision, but it's the stat you would be embarrassed not to know off the top of your head if someone asked. Here it's the total number of employees as of the current point in time. (EOP is shorthand for End of Period. Be precise in how you define things.
More on this at the end.) Next, you'll want to see headcount trended over time. Here we have a monthly trend paired with the same period last year. Boom. Now you can see how things are changing and how they compare with the previous year. These two visuals are also a great test run for your existing reporting and analytics capabilities. In the bottom right, you have headcount broken out by org unit (or business unit, or supervisory org for you Workday types). Here you want not only the total counts but ideally a stacked column view, so you can see the proportion of contractors, part-time, co-op, or other employment types. Different orgs might get their work done in different ways; you should know the differences. Finally, a map view of headcount by geography. It's not a basic visual, but it has certainly become essential. Things happen in the world. You need to know where your workforce is so you can quickly estimate the impact and plan support. In just the past two years, employees have been impacted by wildfires, heat domes, political unrest, blizzards, cold snaps, flooding, and, of course, COVID. Geo maps have officially gone from fancy visual to essential view.

2: Span of Control Dashboard

I'm going to change things up a bit by elevating span of control to the second slot on this list. Don't worry. We dive into attrition and representation later in the article. As a people leader, you've got to maintain some perspective on how efficiently your workforce gets work done. There are many ways to do this. You could calculate the total cost of your workforce. You could align those costs against revenue over time. By all means, do that. But this list is there to help you get started. With just the data from your core HCM / HRIS system, your team should be able to show you span of control and organizational layers. These metrics always remind me of stepping on a scale. If your span of control is ticking down, you're getting less lean.
If you're adding more layers, your internal coordination costs are going up. There could be good reasons for this, but there sure as heck can be bad reasons too. Here you'll find your key span of control metrics, your trend over time, and your layers and org units visualized. The real killer metric, if you've got the stomach for it, is a simple list of the managers in your organization who have only one or two direct reports. Use these views to keep your talent management processes grounded in business reality. If your existing team or technology can't produce these views, push back.

3: Employee Turnover Dashboard or Attrition Dashboard

Ok, we can't go any further without employee turnover. Attrition, if you're feeling fancy. Turnover is the strongest signal you get from your workforce. Someone worked here and, for one reason or another, it didn't work out. Changing jobs and firing an employee are both major events. Your workforce is telling you something, and you need to listen to help with employee retention. Here's a basic view to get you started. Again, get your rolling 12-month termination rate up at the top and trend it out against the previous year for context. Below that, you see a breakout of voluntary and involuntary termination rates. Then you can see breakouts by business unit, location, and org tenure groupings. Now, with a glance, you can see how turnover rates are changing, where they are high, and whether it's you or the employee forcing the change. Learn more about how to calculate the cost of turnover.

4: Talent Flow Dashboard

Once you've got a turnover view squared away, you can move into broader views of talent movement within your organization. Here's a high-level talent flow view to get started. It leads off with a consolidated view of hires, terms, and net hires trended over time. I love this view because it lends itself to discussions of churn and the cost of turnover. The top area (green) shows external hires.
The bottom (red) shows exits/terminations. The dark bars show the difference: net hires. The big question: how much of the time and money you put into recruiting is just to replace the people who leave the company? A great variation on this view is to limit it to women or underrepresented groups. Are you working hard to attract these demographics, only to have them leave because they don't find the organization to be a fit for them? We'll get to more workforce representation views below. Next to the net hire trend, you can mix in a growth metric and a helpful breakout by business unit, so you can keep an eye on which segments of the organization are growing or shrinking. Are they the ones you expect? Later, when you bring in data from other systems like learning, this view can be a place to collaborate with the learning team to answer questions like: Are you adding more employees when you could be upskilling? Finally, get a solid crosstab view of promotions or movements. This will help you optimize talent development and answer questions like: Do people move from function to function? If so, what are the common paths? What paths don't exist? Should they?

5: Career Growth / Promotions Dashboard

After you get the big picture on movements, dig into promotions. In my mind, the movement and span of control views are about what the organization is experiencing. Promotions put you more in the mind of your employees and what career opportunities look like in your organization. I've added two of our key metrics to the top of this one: the rate at which people get promoted, and the typical wait for promotion. Once you know the typical (average or median is fine) wait time, keep your ears out for high-potential / high-performing employees who have run past that mark. They're probably keeping a rough estimate of that metric in their minds as well. Below that are two breakout views. The first one, "Manager Hires vs. Promotions to Manager", is meant to look at a key milestone in career growth. I've used promotion to manager, but you might have unique ones. Then, for each business unit, I've compared the number of promotions into that key group with the number of outside hires into that group. Are you growing your own leaders (or another key group)? If not, why? Filling out the bottom row is the "Termination Rate and Headcount by Time since Last Promotion" view. Look for two things here: 1) Do people leave if they don't get promoted? 2) Do people leave right after they get promoted?

6: Diversity (DE&I) Dashboard

It's past time we brought in views of diversity, equity, and inclusion (DE&I) in your workforce. Many of the views in the dashboard below are split-out versions of the metrics introduced above. Above is a sample diversity dashboard using male / female breakouts. Use this as a template for other representation breakouts, including ethnicity, gender identity, age, etc. Any of these views could be modified to incorporate multiple, rather than just two, groupings. The top bar shows activity differentials over time. Hires are shown simply as counts: do you hire more men than women? Promotions and terminations are handled as rates to monitor for disproportionate outcomes, i.e., are men promoted more often than women? The second row shows representation by key grouping in stacked horizontal bars. I like organizational layer and salary band to show whether high career outcomes are disproportionate. I'd also recommend including tenure. If your organization has had a history of disproportionate staffing, you will get a clue in this view. That could explain why today's initiatives have not yet balanced out outcomes in level or pay. Or differences in tenure might be explained by differences in termination rates, depicted directly above in this view. This is a multifaceted issue.

7: Employee Tenure Dashboard

Confession: I love tenure.
I've come of age in my career amid data telling me that I'll work for something like 11 companies before I retire. And, to be honest, I've done my share of career hopping. But it turns out that when you stick around somewhere, you learn things. You make connections with your co-workers. Employee tenure represents the accumulation of invaluable knowledge and connections, and it helps you measure the value of your human capital. Next to average tenure, this dashboard shows the total accumulated workforce tenure in years. While not exactly a "walking around number," you can use this to impress your fellow leaders into thinking about your workforce like the treasured asset it is. "Hey, our team has x millennia of accumulated experience!" Rounding out this view is a sorted view of positions or job titles with lots of accumulated experience, as well as a stacked trend over time to see how tenure groupings are changing.

8: Dashboard Definitions and Details

This final section is not a specific dashboard suggestion. Rather, it's intended as a sobering reminder that none of the dashboards above will make an impact in your organization if you can't explain your logic and build trust in your data. I like to build little glossary-style views right into the dashboards I create. For example, at the bottom of our standard attrition storyboards, I've added breakouts showing which termination reason codes are included as voluntary and which as involuntary. Next to my glossary, I've created a table that breaks out the subcomponents of turnover rate, such as total headcount and days in period. I like to include at least one leap year for a bit of showmanship. "Look, I've even accounted for the fact that 2020 had 366 days, so back off." Finally, if your security models and technology support it, drill to detail. This is the number one, all-time champion feature of people analytics.
Click on headcount, terminations, whatever, and see the actual people included in the data. Bonus points for adding the definition and "breadcrumb trail" for metrics that build off other metrics. Below is a view of how we do that in One Model. If you'd like to see these people analytics dashboards in action or learn more about people analytics software for your organization, reach out to us!
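To make the "days in period" and leap-year remark above concrete, here is a minimal sketch of an annualized termination rate. The exact definition is an illustrative assumption, not One Model's formula: organizations differ on the denominator (here, average headcount) and the annualization basis.

```python
from datetime import date

def annualized_termination_rate(terms, avg_headcount, period_start, period_end):
    """Annualized termination rate for an arbitrary period.

    Illustrative definition (real implementations vary):
      raw rate   = terminations / average headcount over the period
      annualized = raw rate * (365 / actual days in the period)
    Counting actual calendar days means a leap year (366 days)
    is handled correctly rather than assumed away.
    """
    days = (period_end - period_start).days + 1  # inclusive of both ends
    return (terms / avg_headcount) * (365 / days)

# A full leap year: 120 terminations against an average headcount of 1,000
rate_2020 = annualized_termination_rate(120, 1000, date(2020, 1, 1), date(2020, 12, 31))
```

Note that 2020 contributes 366 days to the denominator, so its annualized rate comes out slightly below the raw 12% figure.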
6 min read
Dennis Behrman
Anyone who analyzes data knows there's always a need to drill into reports to answer the questions that pop up. One Model is the only people analytics platform that allows you to drill into literally anything and everything, as if your siloed enterprise data sources were a single source of truth.

Working with Metrics, Dimensions, and Time

Explore is a powerful tool designed to help you perform ad-hoc analysis on your One Model storyboards, reports, and visualizations. Here's a quick look at how the Explore tool works.

Select Your Metrics

Metrics are quantifiable measures used to understand the results or outcomes that you observe in your business. The Explore tool presents the entire list of metrics available to you based on your organization's metrics library and the access permissions associated with your user profile and your group/team membership. You can add and remove metrics by dragging and dropping them from your metrics library to your metrics selection fields. Your metrics library contains all of the direct and derived values used to tell the stories hidden within your data. To learn more about how metrics are established in your One Model instance, check out this article or this video. (You may need to log in with your One Model account to view Help Center content.)

Pick Your Dimensions

Dimensions are attributes or categories by which data can be grouped. Dimensions organize data into meaningful sections for comparison. For example, in a turnover report, dimensions could include rank, business unit, performance rating, and so on. To sub-group your data even further, you might want to add pivoted dimensions, which help you compare groups by more than one attribute. One Model's Explore tool allows you to drag and drop any number of dimensions into your report to see an analytical picture with more detail.

Mind Your Time Model

Time modeling is perhaps the trickiest and most important activity that happens on the One Model platform.
Since everything in your business happens in time, your data analysis depends on the most comprehensive coverage of observable and measurable events for analyzing data over different periods. In theory, time subdivides infinitely. But in practice, most analysts and decision makers prefer to view time through a standard set of lenses such as days, months, quarters, and years. And since months, quarters, and years can have different numbers of days within them, getting time right is critical to understanding your business as accurately as possible. It's important for cumulative measures to "add up" or "sum to the right number" when aggregated (or drilled through) at scale. It's equally important for data about events to be captured at various time intervals. For example, a group of employees who are currently high-performing rock stars may inform a decision today about high performers. But in reality, many of those rock stars may have been groupies in the past. One Model has no peer when it comes to the most effective application of time series analysis. Here's why. I created the Sankey diagram below with fake data to make a point. Observe how none of the more than 4000 high performers at the end of 2021 remained high performers at the end of 2023. So any analysis conducted in 2024 that uses the pool of 2023's high performers to infer multi-year trends would be an incomplete and possibly flawed analysis of the company's high performers. Most other approaches don't account for the question of "how it looked" in the past.

Explore Explore's Unrivaled Speed to Insight

Your organization needs the most accurate and current information to make the most informed talent decisions. The Explore tool is one of many keys to telling the stories within your data.

Approachable & Intuitive

One Model's Explore tool features a professional-class user interface designed to cater to both casual and highly technical users.
This balanced design ensures that casual users can easily navigate and utilize the tool without feeling overwhelmed, while technical users have access to advanced functionality and customization options. The interface's adaptability fosters a productive environment for all users, enabling them to swiftly uncover insights and make data-driven decisions.

Consistent Metric Definitions Paired with Flexible Dimensional Pivots

The Explore tool ensures the consistent application of organization-wide metric definitions and offers the flexible application of dimensions, enabling users to tailor analyses to their specific needs. By presenting a cohesive and accurate picture of organizational data, the Explore tool enables faster and more reliable insights, accelerating the overall time-to-insight.

Better, Faster Insights for More Decision-Makers Around Your Organization

One Model's Explore tool excels in its ability to deploy sound insights to any team or decision maker within an enterprise. By seamlessly integrating with various data sources and offering robust reporting features, the tool ensures that actionable insights are readily accessible to all relevant stakeholders. No other people analytics platform drives data-driven decision-making better than One Model, thanks to tools like Explore, which empower organizations to make informed decisions quickly and efficiently.

Essential Questions to Ask When Selecting an AI-Powered HR Tool

Learn the right questions to ask to make the right decisions as you explore incorporating AI in HR.
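The point-in-time idea this article illustrates with the Sankey diagram, evaluating who held a label such as "high performer" as of each past period rather than projecting today's labels backward, can be sketched with a simple effective-dated lookup. The record layout and names below are hypothetical illustrations, not One Model's internal data model.

```python
from datetime import date

# Effective-dated performance history: (employee, rating, valid_from, valid_to).
# An open-ended row uses a far-future end date.
history = [
    ("ana", "high", date(2021, 1, 1), date(2022, 12, 31)),
    ("ana", "mid",  date(2023, 1, 1), date(9999, 12, 31)),
    ("ben", "mid",  date(2021, 1, 1), date(2022, 6, 30)),
    ("ben", "high", date(2022, 7, 1), date(9999, 12, 31)),
]

def high_performers_as_of(records, as_of):
    """Return the employees rated 'high' on the given date, using the
    row whose validity window contains that date ("how it looked" then)."""
    return {emp for emp, rating, start, end in records
            if rating == "high" and start <= as_of <= end}

# The 2021 cohort and the 2023 cohort are different people:
print(high_performers_as_of(history, date(2021, 12, 31)))  # {'ana'}
print(high_performers_as_of(history, date(2023, 12, 31)))  # {'ben'}
```

Analyzing only today's labels would miss that the two cohorts barely overlap, which is exactly the trap the Sankey example above describes.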
9 min read
Steve Hall
In organizational management, span of control plays a key role in defining how streamlined and agile a company can be.

Understanding the Span of Your Manager-to-Employee Relationships

At its core, span of control refers to how many people a manager or supervisor directly oversees. The optimal number depends on a variety of factors, including job type and job level, and most organizations set targets using rules of thumb and experience. The span of control metric helps determine whether the organization is structured appropriately: too large a span of control leads to ineffective management and manager burnout, while too small a span of control leads to inefficiency. To calculate your average span of control, divide the total number of direct reports by the total number of supervisors. For instance, if there are 100 direct reports to 10 supervisors, the average span of control is 10.

Exploring the 2 Types of Span of Control

In the context of organizational structure, span of control is classified as either wide or narrow. Each type presents unique advantages and challenges, so it is not a one-size-fits-all proposition. The choice between a wide and narrow span of control depends on various factors, including:

The nature of the organization's work and its structural preferences
Industry norms
Complexity of tasks
Managerial capacity
Job level

Both wide and narrow spans have their place, even across departments and job levels within an organization. The key is to find a balance that maximizes efficiency, promotes effective management, and aligns with the organization's overall goals.

Wide Span of Control

In a wide span of control, a single manager supervises many subordinates. This structure is often seen in companies with flat organizational structures, with fewer layers between the top and bottom levels and a shorter chain of command. Wide structures are also more common at lower levels in organizations.
Features:

Low supervision overhead costs
Prompt response from employees
Improved coordination
Suitable for repetitive or low-skill tasks

Advantages:

Encourages delegation of authority
Facilitates better manager development
Ensures clear policies
Promotes autonomy among subordinates
Fewer levels in the managerial structure
Cost-effective
Suitable for larger firms and repetitive tasks
Well-trained subordinates

Disadvantages:

Risk of supervisors being overburdened
Potential loss of control for superiors
Need for highly qualified managing employees
Hindered decision-making
Increased workload for managers
Unclear duties for team members
Confusion among subordinates
Management challenges in large teams
Reduced manager-employee interactions

Narrow Span of Control

Conversely, a narrow span of control is characterized by a manager overseeing a smaller number of subordinates. This approach is prevalent at the top or middle management levels, especially when tasks are complex and require more support from superiors.
Features:

Ideal for new managers to gain supervisory experience
Beneficial for managing remote or diverse teams
Necessary for jobs requiring frequent manager-employee interactions
Useful in new operations and for employee training

Advantages:

Easier communication and management in small teams
High specialization and division of labor
Better opportunities for staff advancement
Direct supervision by managers over each subordinate
Effective communication between subordinates and managers
More layers in the management structure for easier control
Improved management control and effective supervision

Disadvantages:

Potential stifling of employees' creativity due to excessive manager control
Slower decision-making in extended hierarchies
Limited cross-functional problem-solving
Higher costs due to more managerial positions
Delays in information transmission and decision-making

The Challenge of Manual Span Management

Effective span management is a balancing act, nearly impossible to achieve without technology. Strong span management requires examining spans vertically, horizontally, and over time; this creates a complex situation that is not easily or effectively handled without well-orchestrated data.

Span Management Impacts

A high manager-to-employee ratio might lead to insufficient attention to each team member, potentially affecting employee development and performance. Conversely, a low ratio could indicate inefficiencies and a bloated organizational structure that erodes profitability.

Span Management in Different Industries

Span management requires a tailored approach, as the ideal ratio varies by industry and job function. In labor-intensive industries, a higher ratio is often more manageable, whereas in knowledge-based sectors, a lower ratio might be preferable to ensure quality supervision and mentorship.
Seasonal Staffing

Certain industries or departments may experience fluctuations in workload at different times of the year, necessitating a flexible approach to span management. During peak seasons, managers may need to handle more direct reports or delegate responsibilities more effectively, while in slower periods they may focus on training and development. A dynamic strategy can maintain efficiency without compromising the quality of supervision or employee growth.

The Role of HR and Analytics in Span of Control

Human Resources plays a critical role in monitoring and adjusting span of control. HR can track this metric in real time by using analytics tools to help maintain an optimal balance. People analytics software like One Model offers capabilities to analyze and adjust management span of control across various levels and departments, ensuring organizational efficiency and employee satisfaction.

Data-Driven Span of Control Analysis

Span of control analyses help organizations identify optimal structures and make precise staffing decisions in response to changes over time. Using people analytics tools, HR can dissect span of control across different dimensions such as department, geography, and manager level. Analysts should examine span of control:

Both vertically and horizontally, and over time
Relative to gross and net revenue
Relative to employee-related outcomes such as engagement and retention

It is not practical or effective to evaluate and manage span of control manually; this is an area where robust data can drive effective decision making and optimize outcomes. However, to kickstart this analysis, even basic data from a core HCM or HRIS system can be enlightening. Metrics like span of control and organizational layers are akin to stepping on a scale: they provide immediate feedback on the state of your organizational structure.
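The basic calculations described above, average span of control plus flagging managers with very few direct reports, are straightforward once reporting-line data is in one place. A minimal sketch, with illustrative names and data rather than any real HRIS schema:

```python
from collections import Counter

# Illustrative data: each employee mapped to their direct manager
manager_of = {
    "e1": "m1", "e2": "m1", "e3": "m1", "e4": "m1",
    "e5": "m2", "e6": "m3",
    "m1": "ceo", "m2": "ceo", "m3": "ceo",
}

def span_of_control(manager_of):
    """Direct-report counts per manager, plus the average span
    (total direct reports divided by total supervisors)."""
    spans = Counter(m for m in manager_of.values() if m is not None)
    avg = sum(spans.values()) / len(spans)
    return spans, avg

spans, avg = span_of_control(manager_of)

# The "killer metric": managers with only one or two direct reports
narrow = sorted(m for m, n in spans.items() if n <= 2)
```

Here 9 direct reports across 4 supervisors gives an average span of 2.25, and `narrow` immediately surfaces the two managers with tiny teams.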
Within this discussion, key metrics such as span of control trends and visualization of layers and organizational units are invaluable. One crucial metric, for instance, is the number of managers with only one or two direct reports. This simple statistic can reveal much about the nature of your management structure. These insights are essential for keeping talent management processes aligned with business reality. If your current team or technology cannot readily provide these views, it may be time to reconsider your approach and tools. It took our team under 5 minutes to find the ratio between managers and non-managers. How long will it take your team to answer Question #38 on the People Analytics Challenge?

Setting Targets for Span of Control

Setting the right targets for span of control involves considering various factors, including industry norms, organizational structure, and management levels. A higher ratio may be effective for frontline or production roles, while senior management might require a lower ratio to strategize and lead effectively. Organizations often set their span of control targets based on industry benchmarks, aiming for a median that balances efficiency and managerial attention. Variations in span of control targets can be set for different organizational units, such as contact centers, corporate offices, and field operations. But the best organizations strive to surpass industry norms and link span of control metrics with outcomes of interest such as efficiency, profitability, employee engagement, and voluntary turnover. By doing so, they can optimize span of control to drive desired outcomes.

Mastering Span of Control with One Model

Understanding and effectively managing the span of control is crucial for any organization seeking to optimize its structure for maximum efficiency and employee development.
With One Model, organizations can gain the insights needed to make informed decisions about their management structures, ensuring they are well-equipped to adapt to changing market demands and internal growth dynamics. One Model also supports next-level span-of-control analytics by allowing organizations to link span of control with operational metrics, moving the organization from descriptive analytics into the realm of optimization. After all, blindly following industry benchmarks won't ensure optimization within the organization. One Model is equipped to support optimization through modeling core HRIS data, employee engagement data, employee performance data, and operational data related to production, safety, and financial outcomes. If you aren't using a tool to measure and track span of control, you're missing out. If you aren't linking span of control to business metrics that matter, you're really missing out.
Featured
10 min read
Phil Schrader
Succession planning is a strategic HR function. Its purpose is to map out key positions in the organization and identify potential successors who are (or will be) ready to step into those key positions when they become vacant. Organizations with effective succession planning programs are more resilient. When a critical role is vacated, they already know who can step up and fill the role. Succession management also boosts employee motivation because employees can see a path forward within the organization. Strategic HR activities like this go hand in hand with People Analytics. In order to effectively plan for the future, you need clarity around what you want to accomplish and whether you are improving. Metrics help you create that clarity. How many of our plans have successors? How ready are they? What's our bench strength? Are our successors representative of the wider talent pool? So let's dig in and talk about that union of strategy and analytics. How do you measure your succession plan readiness, and what are the key metrics for succession planning and leadership development?

Measuring Succession Planning

First, here is an "oldie but a goldie" video walking through the succession planning process. Second, here are the key elements of measuring succession planning.

Scope: What are the critical roles that require identified successors? Ideally, your program covers all non-entry-level roles, but time is scarce, so prioritize.

Coverage: Given the scope above, do you have plans set up for all critical roles?

Readiness: Have you evaluated each successor's readiness for each plan they are in? Remember that one person might be a successor for multiple positions, and they might be more ready for some roles than others. Readiness can be categorized in high-level groupings, for example, "Ready Now", "Ready in < 1 Year", and "Ready in > 1 Year".

Bench Strength: Given coverage and individual readiness, how strong is your bench? Can you fill all critical roles? Is your bench still strong if you net out the successors, i.e., account for people who are selected in multiple plans?

Diversity: Does your plan make full use of the available talent in the organization? Have historical tendencies caused you to overlook strong successors because they have different backgrounds and experiences from the incumbents? Will your leadership ranks become more or less diverse when your plans move into action?

It took me 44 minutes and 56 seconds to pull together the metrics above to answer Question #25 from the People Analytics Challenge. Let me show you the full Succession Dashboard. Connect with us today!

Key Metrics (with Definitions)

Here are the key metrics you can use to address the strategic questions above.

Percent of Leaders with a "Ready Now" Successor

Bottom line: what does your successor coverage look like right now? Count up the number of leaders who have a successor that is ready now. Divide that count by the total number of roles in your succession planning program (see Scope above). For example, if you have 10 positions that you've identified as needing a successor and you have a ready-now successor for 7 of those roles, then your percentage of leaders with a ready-now successor is 70%. Now flip that number around and say to yourself, "OK, if one of our really key people left today, there's a 30% chance that we'd have no one ready to take over that position." Don't let that be you. Use the detailed data from this calculation to create an operational list of the positions without a successor. Then work the list!

Gross and Net Bench Strength

The first metric tells you how ready you are to move on from one key person. Gross and net bench strength give you a sense of how resilient your organization would be in the face of multiple changes. Technical note: these calculations assume that your program has set out to have 3 successors identified for each key role.
Gross Bench Strength: Total successors divided by total successors needed, ignoring whether the successors are used in multiple plans.

Net Bench Strength: Total successors divided by total successors needed, counting each successor only once, i.e., taking into account whether the successors are used in multiple plans.

So let's look at these calculations together. Let's say you have 10 key roles and you have determined that you should have 3 successors for each. That means your total successors needed is 30. Now go through your plans and add up all the listed successors. Perhaps you have 26. That means you have 26 successors out of the 30 you need, making a gross bench strength of 87%. Awesome.

OK, now let's get more nuanced. Let's deduplicate the list of successors. Maybe there are 2 high potentials in that pool who are listed on all 10 plans. An extreme example, but useful for our illustration. That means that there are really only 8 unique successors. That makes your net bench strength 8 / 30, or 27%. This difference between a gross bench strength of 87% and a net bench strength of 27% tells you that you have good immediate coverage but low resiliency. You can effectively respond to 1 or 2 people leaving, but beyond that, your bench will be depleted.

Incumbent vs. Successor Diversity %

Generally speaking, today's organizations are looking to take full advantage of their available talent by ensuring that traditionally underrepresented groups are considered for advancement. A simple way to check on this progress is to compare the representation numbers of your incumbents to the representation of your successors. Let's suppose the current pool of employees in key roles is 10% diverse while your pool of successors is 20% diverse. This is a signal that your succession planning process will contribute to greater diversity in your key positions in the future. Remember to align your successor diversity metrics with the key groupings defined by your organization's DE&I program.
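The bench-strength arithmetic is simple to compute. Here is a sketch using invented plan data that reproduces the 87% / 27% worked example (names and plan structure are hypothetical):

```python
# Invented plan data: 10 key roles, a target of 3 successors per plan,
# with "pat" and "sam" listed on every plan (the extreme case above).
TARGET_PER_ROLE = 3
plans = {f"role_{i}": ["pat", "sam"] for i in range(1, 11)}
for i, extra in enumerate(["ana", "bo", "cy", "dee", "eli", "fay"], start=1):
    plans[f"role_{i}"].append(extra)

needed = TARGET_PER_ROLE * len(plans)                 # 30 successors needed
listed = sum(len(s) for s in plans.values())          # 26 listed successors
unique = len({n for s in plans.values() for n in s})  # 8 unique people

gross_bench_strength = listed / needed  # 26 / 30
net_bench_strength = unique / needed    # 8 / 30

print(f"gross {gross_bench_strength:.0%}, net {net_bench_strength:.0%}")
# → gross 87%, net 27%
```

The gap between the two numbers is exactly the resiliency signal described above: deduplication is the only difference between the formulas.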
These groupings could include gender, ethnicity, or other employee attributes.

Promotion Rate and Time on Bench

If you make progress on the metrics above, then you'll be leading your organization into a more resilient future. Good job! But remember, resilience is great for the organization only up to a point. The high-potential employees in your plans have their own career goals. If they feel stuck on the bench, they're likely to find their next role outside the company. If you are so resilient that you could back up all your key leaders for the next 25 years, then you are fooling yourself. Those high-potential employees listed on your plans will be long gone by then. So keep an eye on the promotion rate of your internal candidates over time (number of promotions / average headcount). They'll be making their own estimates as well. Alternatively, you might calculate the time on bench for your successors. When one of your successors leaves the company, check to see if they were on the bench too long. Or just ask them in your exit interview. Pay particular attention to the time on bench for your diverse successors. It's not enough to say, "Look at how diverse our bench is!" if those candidates are continuously passed over for the next big job.

Using Successor Metrics to Support People Strategies

The metrics above are just a starting point. The key to strategic HR and people analytics is a willingness to ask important questions and use data to answer those questions. Ideally your succession planning process fits into a larger talent management vision that is supported by a wide range of interconnected datasets and measures. For example, you may be ready to fill key roles with external candidates. Your time to fill for similar positions will help you know if that's a reasonable backup strategy. Alternatively, your employee pulse survey data and turnover and attrition analyses may indicate that you are having a hard time retaining diverse employees.
Perhaps this will link back to the time on bench calculations discussed above. You are unlikely to find meaningful answers in a single data source, so invest in building the right underlying data architecture to connect data from succession plans, core HR, recruiting, engagement, compensation, and other workforce data. At the same time, keep the strategic focus in mind so that you're not just doing analytics for analytics' sake. Come back to the important questions like, "If we lost someone in a key role today, what's the percent chance we'd be totally flat-footed with no idea how to replace them?"
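As a sketch of the promotion-rate and time-on-bench ideas discussed above (the records here are invented; real data would come from your succession and core HR systems, and the fuller promotion-rate definition divides by average headcount rather than bench size):

```python
from datetime import date

# Invented successor records: when each joined the bench, and whether
# they have since been promoted or have left the company.
bench = [
    {"name": "ana", "since": date(2021, 3, 1), "promoted": date(2022, 6, 1), "exited": None},
    {"name": "bo",  "since": date(2020, 1, 15), "promoted": None, "exited": date(2023, 1, 15)},
    {"name": "cy",  "since": date(2022, 9, 1), "promoted": None, "exited": None},
]
as_of = date(2023, 9, 1)

def days_on_bench(rec):
    # Bench time ends at promotion, at exit, or at the as-of date.
    end = rec["promoted"] or rec["exited"] or as_of
    return (end - rec["since"]).days

# Simplified promotion rate among successors: promotions / bench size.
promotion_rate = sum(1 for r in bench if r["promoted"]) / len(bench)

for r in bench:
    print(r["name"], days_on_bench(r))
print(f"promotion rate: {promotion_rate:.0%}")
```

In this toy data, "bo" sat on the bench for roughly three years before leaving, which is precisely the pattern the exit-interview check above is meant to catch.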
Featured
6 min read
John Carter
When examining the workforce dynamics of an organization, it's common to fixate on revenue-generating roles. After all, these positions are directly responsible for bringing in profits. However, focusing solely on revenue-centric roles leaves out a significant chunk of the workforce: the non-revenue employees.

The Role of Non-Revenue Employees

While non-revenue employees might not directly contribute to the financial bottom line, their contributions are foundational to the organization's success. They constitute the vast business "machinery" that powers the organization, supports revenue-generating roles, and ensures smooth business operations. In fact, they can represent the larger share of your workforce. These include roles in HR, IT, administration, and many other indirect revenue roles that maintain the infrastructure of a business. Non-revenue units keep the operations of a business running. Imagine a product-based company without a logistics team to ensure timely deliveries, or a multinational enterprise without HR personnel to manage its vast workforce. The value of non-revenue-producing departments becomes clear when you consider the chaos that would ensue in their absence. Non-revenue employees often introduce efficiency, stability, and scalability into an organization. They identify bottlenecks, streamline processes, and ensure that the revenue-generating departments can operate at peak productivity. Indirect revenue employees may not directly contribute to sales, but they directly influence revenue by delivering a high level of customer satisfaction, meeting or exceeding CSAT goals, reducing churn, and creating referenceable champion customers. It took me 10 minutes and 15 seconds to create this breakout. Want to see me do it live? Fill out the form, and let's connect our teams.
The Value of Non-Revenue Units in People Analytics

While non-revenue-generating (NRG) roles may not directly influence the new sales revenue stream, they are foundational to an organization's long-term success. Here's why:

Holistic Workforce Analysis: An organization gets only a skewed view of its workforce by concentrating on revenue-producing roles. People analytics should consider every layer and department to ensure a balanced strategy for talent acquisition, retention, and development.

Reducing Churn in Non-Revenue Departments: Turnover in non-revenue-producing departments can be just as detrimental as in sales or business development. For instance, frequent changes in support and client services roles lead to a loss of inherent knowledge, long ramp-up times, and a loss of confidence among customers, reflected in low CSAT scores, while turnover in HR can impact talent management strategies. Organizations can reduce churn, stabilize operations, and indirectly boost revenue by applying people analytics to these non-revenue units.

Identifying Opportunities for Upgrading Skills: As businesses evolve, the roles of non-revenue employees change. People analytics can help identify the need for new skills or training in these non-revenue units, find employees who already have those skills, and put those people to use, ensuring they continue to support the company effectively and saving money in the long term (training and recruitment costs).

The dilemma often faced revolves around headcount: is it worth investing in these indirect revenue employees? The perceived short-term pain of increasing payroll for NRG employees often becomes a deterrent. As leaders, it's tempting to don many hats, especially with constrained budgets. But in doing so, are leaders truly optimizing their own roles?
An organization's head, tasked with vision, direction, and often direct revenue generation through donations, can get tangled in the intricacies of non-revenue units, thereby diluting their effectiveness.

The Opportunity Cost of Non-Revenue Departments

Convincing a board to hire for NRG roles, especially in medium or smaller organizations, is not straightforward. How you frame the argument is key. One approach is the opportunity-cost perspective. By calculating an executive director's (ED) hourly pay and then juxtaposing that against time spent on non-revenue-producing department tasks, organizations can discern the real costs. For instance, if an ED earning $70,000 annually spends 10 hours weekly on tasks better suited for an NRG role, that's an annual cost of $17,498. If reallocating those 10 hours could generate more than this amount, that's a stronger case for hiring specialized staff. While it's not always as black and white, this method provides tangible metrics, bridging the gap between HR and finance in understanding the worth of non-revenue employees. Ultimately, the emphasis should be on the organization's health and growth. While NRG roles might not bring in direct revenue, their contribution allows revenue-generating sectors to flourish.

The Future of Non-Revenue Employees in Business Strategy

The line between revenue-generating roles and non-revenue employees is blurring. As businesses increasingly adopt interdisciplinary strategies, the contributions of non-revenue units become more intertwined with revenue outcomes. For example, an effective marketing campaign (often considered a cost center) can significantly boost sales, making it an indirect revenue function. The bottom line? While the spotlight often shines brightest on revenue-generating roles, the silent machinery of non-revenue employees is what keeps a business thriving.
It's time we acknowledge the importance of non-revenue producing departments and give them the attention they deserve in our people analytics endeavors. Want to see if your people analytics team can answer the top questions asked of HR as fast as us? Download the people analytics challenge!
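The opportunity-cost framing from the executive-director example reduces to a few lines of arithmetic. A sketch, assuming a 40-hour week and 52 paid weeks (the article's figure implies a marginally different hours basis):

```python
# Opportunity cost of an executive director (ED) doing support-role work.
# Assumptions: 40-hour week, 52 paid weeks per year.
ANNUAL_SALARY = 70_000
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 52
NRG_HOURS_PER_WEEK = 10  # weekly hours spent on non-revenue-role tasks

hourly_rate = ANNUAL_SALARY / (HOURS_PER_WEEK * WEEKS_PER_YEAR)
annual_cost = hourly_rate * NRG_HOURS_PER_WEEK * WEEKS_PER_YEAR

print(f"${annual_cost:,.0f}")  # → $17,500
```

Swapping in your own salary and hours assumptions gives the board a concrete threshold: if reallocated ED time would generate more than this number, the specialized hire pays for itself.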
Featured
10 min read
Joe Grohovsky
John Sumser, one of the most insightful industry analysts in HR, recently wrote an article providing guidance on the selection of machine learning/AI tools. That article is found HERE, and it can serve as a rubric for reviewing AI and predictive analysis tools for use in your people analytics practice or HR operations. Much of our work day is filled with conversations regarding the One Model tool and how it fits into an organization's People Analytics initiative. This is often the first practical exposure a customer contact has to using Artificial Intelligence (AI), so a significant amount of time is invested in explaining AI and the dangers of misusing it.

Good Questions to Ask About Artificial Intelligence Solutions - And Our Answers!

Our product, One AI, delivers a suite of easy-to-use predictive pipelines and data extensions, allowing organizations to build, understand, and predict workforce behaviors. Artificial Intelligence in its simplest form is about automating a decision process. We class our predictive modeling engine as AI because it is built to automate the decisions usually made by a human data scientist in building and testing predictive models. In essence, we've built our own automated machine learning toolkit that rapidly discovers, builds, and tests many hundreds of potential data features, predictive models, and parameter tunings to ultimately select the best fit for the business objective at hand. Unlike other predictive applications in the market, One AI provides full transparency and configurability, which implicitly supports peer review. Every predictive output is peer-reviewable, not only at a given moment in time but for all time. This post will follow a Q&A style as we comment on each of John's 12 critical questions to ask an artificial intelligence company.

1) Tell me about the data used to train the algorithms and models.

Ideally, all data available to One Model is used for feeding the machine learning engine - the more the better.
You cannot overload One AI because it is going to wade through everything you throw at it, decide which data points are relevant and how much history it should use, and then select, clean, and position that data as part of its process. This means we should feed every available system into the engine - HRIS, ATS, survey, payroll, absence, talent management - everything and the kitchen sink, as long as we're ethically okay with its potential use. This is not a one-size-fits-all algorithm; each model is unique to the customer, their data set, and their target problem. The content of training data can also be user-defined. Users define what type of data is brought into the modeling process, choosing which variables, filters, or cuts will be offered. If users want to specify how individual fields will be treated, they can do so at any time, with the same types of levers as you would have in creating your own model externally.

2) How long will it take for the system to be trained?

The scope of data and the machine learning pipeline determine training time. The capacity to create models is intrinsically available in One AI, and training can take anywhere from 5 minutes to 20+ hours. For example, we automatically schedule re-training of a turnover prediction model for a 15,000-employee customer, which completes in about 45 minutes.

3) Can we make changes to our historical data?

Yes. Data can be held static or refreshed every time the model is trained. One AI acts as a data science orchestration toolkit that automates the data refresh, training, build, and ongoing maintenance of the model. Models are typically scheduled to refresh on a regular basis, e.g. monthly. With every run, extensive reports are created, time-stamped, and logged, so users can always return to summary reports of what the data looked like, the decisions made, and the performance of the model at any given time.

4) What happens when you turn it off?
How much notice will we receive if you turn it off? One AI models and pipelines are completely persisted. They can be turned on and off with no loss of data or logic. We are a data science orchestration toolset for building and managing predictive models at scale. Is AI being offered in a solution for your HR team? Download our latest whitepaper to get the questions you should ask in the next sales pitch when someone is trying to sell you technology with AI.

5) Do we own what the machine learned from us? How do we take those data with us?

Yes, customers own the results from their predictive models, and those results are easily downloaded. Results and models are based upon your organization's data. One Model customers only see their own results, and these results are not combined with other data for any purpose. All the decisions that the machine made to select a model are shown and could be used to recreate the model externally as well.

6) What is the total cost of ownership?

Predictive modeling, along with all features of our One AI product, is included within the One Model suite subscription fee.

7) How do we tell when the models and algorithms are "drifting"?

Each predictive model is generated with fully transparent results. Once a One AI run is finished, two reports are generated for review:

Results Summary – This report details the model selected and its performance.

Exploratory Data Analysis – This report details the state of the data that the model was trained on, so users can determine whether the present-state data has changed drastically.

Models are typically scheduled to be re-trained every month with any new data received. The new models can be compared to the previous model using the output reports generated. It is expected that models will degrade over time, and they should be replaced regularly with better-performing models incorporating recent data.
This is a huge burden on a human team, hence the need for data science orchestration: automating the manual process and taking data science delivery to scale.

8) What sort of training comes with the service?

One Model's customers are trained on all aspects of our People Analytics tool. Training is offered so that non-data scientists can interpret the Results Summary and Exploratory Data Analysis reports and feel comfortable deploying models. A named One Model Customer Service Manager is available to aid and provide guidance if needed.

9) What do we do when circumstances change?

One AI is built with change in mind. If the data changes in a way that breaks the model, or the model drifts enough that a retrain is necessary, users can restart the automated machine learning pipelines to bring in new data and create a new model. The new model can be compared to the previous one. One AI also allows work to occur on a draft version of a model while the active model runs in production.

10) How do we monitor system performance?

The Results Summary and Exploratory Data Analysis charts provide extensive model performance and diagnostic data. Actual real-world results can be used to assess the performance of the model by overlaying predictions with outcomes within the One Model application. This is also typically how results are distributed to users, through the main analytics visualization toolsets. When comparing actual results against predictions, One Model cautions users to be aware of underlying data changes or company behaviors skewing results. For example, an attrition model may identify risk due to an employee being under-trained. If that employee is then trained and chooses to remain with the organization, the model may have been correct, but because the underlying data changed, the results can't really be compared. In the case of this employee, their risk score today would be lower than their risk score from several months ago, prior to training.
The action to provide additional training may indeed have been a response from the organization to address the attrition risk, and actions like these that are specifically made to address risk must also be captured to inform the model that mitigation has taken place. The Results Summary and Exploratory Data Analysis reports typically build enough trust in cross-validation that system performance questions are not an issue.

11) What are your views on product liability?

One AI provides tooling to create models, along with reports for model explanation and interpretation of results. All models and results are based exclusively on a customer's own data. The customer must review the model's results and choose whether to deploy them and how to use those results within the organization. We provide transparency into our modeling, and explanations that provide confidence and knowledge of what the machine is doing, rather than just trusting that a black-box algorithm is working (or not). This is different from other vendors, who may deliver inflexible canned models that were trained on data other than the customer's, or that cannot use a unique customer data set relevant to the problem. I would be skeptical of any algorithm that cannot be explained or whose performance cannot be tracked over time.

12) Get an inventory of every process in your system that uses machine intelligence.

Each One Model customer decides how specific models will be run for them and how to apply One AI. These predictive models typically include attrition risk, time to fill, promotability, and headcount forecast. Customers own every model and result generated within their One Model tool. One AI empowers our customers to combine the appropriate science with a strong awareness of their business needs. Our most productive One AI users ask the tool critical business questions, understand the relevant data ethics, and provide appropriate guidance to their organization.
If you would like to learn more about One AI, and how it can address your specific people analytics needs, schedule some time with a team member below.
Featured
10 min read
Phil Schrader
The One Model difference that really sets us apart is our ability to extract all your messy data and clean it into a standardized data catalog. Let's dive deeper. One Model delivers people analytics infrastructure. We accelerate every phase of your analytics roadmap. The later phases of that roadmap are pretty fun and exciting. Machine learning. Data augmentation. Etc. Believe me, you're going to hear a ton about that from us this year. But not today. Today we're going to back up for a minute and pay homage to an absolutely wonderful thing about One Model: we will help you clean up your data mess.

Messy Data? Don't Distress.

Josh Bersin used this phrasing in his talk at the People Analytics and the Future of Work conference (from my notes at PAFOW on Feb 2, 2018). There are huge opportunities to act like a business person in people analytics. In the talk right before Josh's, Jonathan Ferrar reminded us that you get $13.01 back for every dollar you spend on analytics. But you have to get your house in order first. And that's going to be hard. Our product engineering team at One Model has spent their careers figuring out how to pull data from HR systems and organize it all into effective data models that are ready for analytics. If your team prefers, your company can spend years and massive budgets figuring all this out... Or, you can take advantage of One Model. When you sign up with One Model:

1) We take on responsibility for helping you extract all the data from your HR systems and related tools.

2) We connect and refine all that data into a standard data catalog that produces answers your team will actually trust. Learn what happened to Synk when they finally had trust.

Big data cleansing starts with extracting the data from all your HR and related tools. We will extract all the data you want from all the systems you want through integrations and custom reports. It's part of the deal. And it's a big deal!
For some perspective, check out this Workday resource document and figure out how you’ll extract your workers’ FTE allocation from it. Or if Oracle is your thing, you can go to our HRIS comparison blog and read about how much fun our founder, Chris, had figuring out how to get a suitable analytics data set out of Fusion. In fact, my coworker Josh is pulling some Oracle data as we speak and let me tell you, I’m pretty happy to be working on this post instead. Luckily for you, you don’t need to reinvent this wheel! Call us up. We’ll happily talk through the particulars of your systems and the relevant work we’ve already done. The documentation for these systems (for the most part) is out there, so it’s not that this is a bunch of classified top-secret stuff. We simply have a lot of accumulated experience getting data out of HR systems and have built proprietary processes to ensure you get the most data from your tools. In many cases, like Workday, for example, we can activate the custom integration we’ve already built and have your core data set populated in One Model. If you go down that road on your own, it’ll take you 2 - 3 days just to arrange the internal meeting to talk about how to make a plan to get all this data extracted. We spent over 10,000 development hours working on our Workday extraction process alone. And once you do get the data out, there’s still a mountain of work ahead of you. Which brings us to... The next step is refining your extracted data into a standardized data catalog. How do you define and govern the standard ways you are going to analyze your people data? Let’s take a simple example, like termination rate. The numerator part of this is actually pretty straightforward. You count up the number of terminations. Beyond that, you will want to map termination codes into voluntary and involuntary, exclude (or include) contractors, etc. Let’s just assume all this goes fine. Now what about the bottom part? 
You had, say, 10 terminations in the given period of time, so your termination rate is... relative to what headcount? The starting headcount for that period? The ending headcount? The average headcount? How about the daily average headcount? Go with the daily average, for two reasons. 1) It's the most accurate. You won't unintentionally under- or overstate termination rate, giving you a more accurate basis of comparison over time and the ability to correctly pro-rate values across departments. See here for details. And 2) if you are thinking of doing this in-house, it'll be fun to tell your team that they need to work out how to deliver daily average headcounts for all the different dimensions and cuts to meet your data cleaning requirements. If you really want to, you can fight the daily average headcount battle and many others internally. But we haven't even gotten to time modeling yet, which is so much fun it may get its own upcoming One Model Difference post. Or the unspeakable joy you will find managing organizational structure changes, see #10. On the other hand, One Model comes complete with a standard metrics catalog of over 590 metrics, along with the data processing logic and system integrations necessary to collect that data and calculate those metrics. You can create, tweak, and define your metrics any way you want to. But you do not have to start from scratch. If you think about it, this One Model difference makes all the difference. Ultimately, you simply have to clean up your messy data. We recognize that. We've been through it before. And we make it part of the deal. Our customers choose One Model because we're raising the standard and setting the pace for people analytics. If you are spending time gathering and maintaining data, then the yardstick for good people analytics is going to accelerate away from you. If you want to catch up, book a demo below and we can talk. Tell us you want to meet.
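To make the daily-average-headcount denominator concrete, here is a toy calculation (invented employment records; a production pipeline obviously handles far more dimensions and cuts):

```python
from datetime import date, timedelta

# Invented employment records: (start, end); end=None means still employed.
records = [
    (date(2024, 1, 1), None),               # employed all month
    (date(2024, 1, 1), date(2024, 1, 16)),  # terminated mid-month
    (date(2024, 1, 20), None),              # hired mid-month
]
period_start, period_end = date(2024, 1, 1), date(2024, 1, 31)
days = (period_end - period_start).days + 1

def headcount_on(d):
    # Terminated employees stop counting on their termination date.
    return sum(1 for s, e in records if s <= d and (e is None or e > d))

# Daily average headcount: mean headcount across every day in the period.
daily_avg = sum(headcount_on(period_start + timedelta(n)) for n in range(days)) / days

terminations = sum(1 for _, e in records if e and period_start <= e <= period_end)
termination_rate = terminations / daily_avg

print(round(daily_avg, 2), terminations, round(termination_rate, 3))
```

Notice that the mid-month hire and mid-month termination each contribute only their employed days to the denominator, which is what keeps the rate comparable across periods and pro-ratable across departments.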
About One Model: One Model helps thriving companies make consistently great talent decisions at all levels of the organization. Large and rapidly-growing companies rely on our People Data Cloud™ people analytics platform because it takes all of the heavy lifting out of data extraction, cleansing, modeling, analytics, and reporting of enterprise workforce data. One Model pioneered people data orchestration, innovative visualizations, and flexible predictive models. HR and business teams trust its accurate reports and analyses. Data scientists, engineers, and people analytics professionals love the reduced technical burden. People Data Cloud is a uniquely transparent platform that drives ethical decisions and ensures the highest levels of security and privacy that human resource management demands.
Read Article
Featured
5 min read
Phil Schrader
Analytics is a funny discipline. On one hand, we deal with idealized models of how the world works. On the other hand, we are constantly tripped up by pesky things like the real world. One of these sneaky hard things is how best to count up people at various points in time, particularly when they are liable to move around. In other words, how do you keep track of people at a given point in time, especially when you have to derive that information from a date range? Within people analytics, you run into this problem all the time. In other areas, it isn’t as big of a deal. Outside of working hours (sometimes maybe during working hours), I run into this when I’m in the middle of a spreadsheet full of NBA players. Let's explore by looking at an easy-to-reference story from 2018. Close your eyes and imagine I’m about to create an amazing calculation when I realize that I haven’t taken player trades into consideration. George Hill, for example, starts the season in Sacramento but ends it in Cleveland. How do you handle that? Extra column? Extra row? What if he had gotten traded again? Two extra columns? Ugh! My spreadsheet is ruined! Fortunately, One Model is set up for this sort of point-in-time metric. Just tell us George Hill’s effective and end dates and the corresponding metrics will be handled automatically. Given the data below, One Model would place him in the Start of Period (SOP) Headcount for Sacramento and End of Period (EOP) Headcount for Cleveland. Along the way, we could tally up the trade events. In this scenario, Sacramento records an outbound trade of Hill and Cleveland tallies an inbound trade. The trade itself would be a cumulative metric. You could ask, “How many inbound trades did Cleveland make in February?” and add them all up. Answer-- they made about a billion of them. Putting it all together, we can say that Hill counts in Cleveland’s headcount at any point in time after Feb 7. 
(Over that period Cleveland accumulated 4 new players through trades.) So the good news is that this is easy to manage in One Model.

Team         Effective Date   End Date
Sacramento   2017-07-10       2018-02-07
Cleveland    2018-02-08       ---

The bad news is that you might not be used to looking at data this way. Generally speaking, people are pretty comfortable with cumulative metrics (How many hires did we make in January?). They may even explore how to calculate monthly headcount and are pretty comfortable with the current point in time (How many people are in my organization?). However, being able to dip into any particular point in time is new. You might not have run into many point-in-time scenarios before-- or you might have run into versions that you could work around. But there is no hiding from them in people analytics. Your ability to count employees over time is essential. Unsure how to count people over time? Never fear. We’ve got a video below walking you through some examples. If you think this point-in-time stuff is pretty cool, then grab a cup of coffee and check out our previous post on the Recruiting Cholesterol graph. There we go beyond monthly and yearly headcount and dive even deeper into point-in-time calculations. Also, if you looked at the data above and immediately became concerned about the fact that Hill was traded sometime during the day on the 8th of February and whether his last day in Sacramento should be listed as the 7th or the 8th-- then please refer to the One Model career page. You’ll fit right in with Jamie :) Want to read more? Check out all of our People Analytics resources. About One Model: One Model provides a data management platform and a comprehensive suite of people analytics drawn directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own.
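The effective/end-date logic described above can be sketched in a few lines. This is an illustrative toy (the function name and the approximate season dates are our own assumptions), not how One Model implements it:

```python
from datetime import date

# George Hill's team records from the table above (end of None = still current).
stints = [
    ("Sacramento", date(2017, 7, 10), date(2018, 2, 7)),
    ("Cleveland", date(2018, 2, 8), None),
]

def team_on(stints, day):
    """Return the team a player counts toward at a given point in time."""
    for team, start, end in stints:
        if start <= day and (end is None or end >= day):
            return team
    return None  # not on any roster that day

# Start-of-period vs end-of-period headcount membership for the 2017-18 season:
sop_team = team_on(stints, date(2017, 10, 17))  # around the season opener
eop_team = team_on(stints, date(2018, 4, 11))   # around the season finale
```

The same record answers both questions: Hill lands in Sacramento's Start of Period headcount and Cleveland's End of Period headcount, with no extra columns or rows needed.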
Its newest tool, One AI, integrates cutting-edge machine learning capabilities into its current platform, equipping HR professionals with readily-accessible, unparalleled insights from their people analytics data.
Read Article
Featured
4 min read
Dennis Behrman
Now that many of our customers have complete control and access to their data like never before, they're exploring how to tell better data stories. A fun way to explore this topic is to look at great examples from the past. Minard’s example shows us how to tell a story with data. Let’s turn to one of history's most famous data stories: Minard’s visualization of Napoleon’s 1812 march into Russia. Edward Tufte, a renowned expert in data visualization, praised it in his book "The Visual Display of Quantitative Information," calling Minard's graphic "probably the best statistical graphic ever drawn." Although not directly an HR example (other than, in a dark way, as a visualization of workforce attrition over time), we can still analyze how this visual fits into the framework of six effective data storytelling elements and apply those lessons to HR storytelling: Business Objective: In the context of HR, the objective was to convey a message and inspire action. Minard's visualization powerfully demonstrates the disastrous consequences of Napoleon's march, highlighting the importance of understanding the impact of decisions on people. Evidence: Minard's visualization uses data from multiple sources, such as the number of soldiers, their geographic locations, and temperature. In HR data storytelling, this would translate to gathering relevant data from various sources, like employee engagement surveys, performance metrics, or attrition rates, to support the narrative. Visuals: Minard's visualization is a clear, engaging visual representation of complex data. Similarly, HR professionals should utilize data visualization tools to create visually appealing and easy-to-understand representations of workforce data. Narratives: Minard's map tells the data-informed story of the march's progression and the resulting loss of soldiers.
In HR data storytelling, a compelling narrative should weave together the data and insights, making them relatable and memorable for the audience. Interactivity: While Minard's visualization is static, you could imagine leaders in the armed forces looking for cuts of this data by troop category, demographics, and nationality (pre-GDPR). Interactivity would have allowed those leaders to engage quickly with the graphic and see different cuts of the data. With interactivity, HR professionals can adapt their data stories based on the audience's questions and feedback, making the story more engaging and dynamic. Action: Minard's visualization serves as a cautionary tale, prompting leaders to consider the consequences of their decisions. In HR data storytelling, ending the story with a clear call to action can drive engagement and ensure the story leads to meaningful change within the organization. By analyzing Minard's data storytelling example in the context of this simple six-element storytelling framework, HR professionals can gain valuable insights on how to create data-informed stories that effectively communicate the human impact of organizational decisions and inspire meaningful change. Check out 8 Essential People Analytics Dashboards. Ready to tell better data stories with your people analytics data? Download our Data Storytelling eBook today.
Read Article
Featured
0 min read
Lauren Canada
This infographic reveals 4 key HR metrics to strengthen your next data story, so you can prevent costly turnover and retain top talent. Start scrolling to explore the piece!
Read Article
Featured
8 min read
Phil Schrader
I recently sat down with Culture Curated’s Season Chapman and Yuliana Lopez to ask them which metrics were their favourite, and Yuliana said net hires. Let’s find out why: Net hires are a critical component of workforce management, as they help organisations determine staffing needs, forecast future headcount, and make informed decisions about recruitment and retention strategies. In this One Model blog post, we’ll explore the concept of net hires, how it’s calculated, and why it’s essential for organisations to track this metric. What are net hires? Net hires, also known as net hiring or net employment, is a measure that tracks the difference between the number of new employees hired and the number of employees who leave an organisation during a specific period. This metric provides valuable insights into a company's workforce dynamics, such as the rate of employee turnover, the pace of recruitment, and the organisation's overall hiring needs. It is an essential component of workforce planning and management, as it helps organisations determine staffing needs, forecast future headcount, and make informed decisions about recruitment and retention strategies. For example, if a company hires 50 new employees during a quarter and loses 20 employees during the same period, the net hires for the quarter would be 30 (50 - 20 = 30). A positive net hires value indicates that the organisation is expanding its workforce, while a negative value indicates that the organisation is reducing its workforce. The chart below shows new hires juxtaposed against terminations. Explore several ways to visualize headcount here. I like using One Model for the presentation of this data because you can quickly adjust by any segment or time period to see how the story changes when looking at it from different angles. Calculating net hires To calculate net hires, organisations need to track the number of employees who join and leave the company during a specific period.
This information can be obtained from various sources, such as HRIS records, payroll systems, and employee surveys. Once the data has been collected, organisations can use the following formula to calculate net hires: Net hires = Total number of new hires - Total number of terminations For instance, if a company hired 100 new employees and had 50 terminations during a specific period, the net hires for that period would be 50 (100 - 50 = 50). Having trouble balancing headcount with net internal movements? Learn more. Why is monitoring net hires important? Net hires is a critical metric for organisations for several reasons. Firstly, it provides insights into the organisation's overall workforce trends, such as the pace of recruitment, the rate of turnover, and the company's growth trajectory. By tracking net hires over time, organisations can identify patterns and trends in their hiring practices and adjust their recruitment strategies accordingly. Secondly, net hires can inform decision-making: by understanding the rate at which employees are joining and leaving the company, organisations can make informed decisions about their recruitment and retention strategies, including whether to ramp up hiring efforts, invest in employee training and development, or adjust staffing levels in response to changing market conditions. Finally, net hires can also help organisations evaluate the effectiveness of their recruitment efforts. By tracking the number of new hires, organisations can assess the success of their recruitment campaigns and identify areas for improvement. Additionally, by comparing net hires to other metrics, such as employee engagement and retention rates, organisations can gain a more comprehensive view of their overall talent management strategy. Challenges of tracking net hires While net hires are an essential metric for organisations, tracking this metric can be challenging. One of the main challenges is ensuring the accuracy of the data.
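The formula above is simple enough to express directly (the numbers are the hypothetical examples from the text):

```python
def net_hires(new_hires, terminations):
    """Net hires = total number of new hires minus total number of terminations."""
    return new_hires - terminations

# The examples from the text:
q_net = net_hires(50, 20)      # 30: the organisation is expanding
p_net = net_hires(100, 50)     # 50
shrinking = net_hires(20, 50)  # -30: a negative value means the workforce is shrinking
```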
HR records and payroll systems are prone to errors and inconsistencies, which can lead to inaccurate calculations of net hires. Moreover, tracking net hires requires a robust data infrastructure, including data collection, storage, and analysis tools. Another challenge is defining the period over which net hires should be calculated. Since you are measuring change over time, you could run into a period where hires and terminations cancel out to a zero result in calculated measures; having a tool that can understand and make sense of that is important. Organisations also need to determine whether to track net hires on a monthly, quarterly, or annual basis, depending on their specific workforce management needs. Moreover, organisations need to ensure that the period over which net hires are calculated is consistent across all departments and business units, to enable accurate comparisons. Optimising net hires To optimise net hires, organisations need to adopt a data-driven approach to recruitment and talent management. A key way to do that is by using people analytics tools to track and analyse workforce data, including net hires, turnover rates, and engagement levels. Final lessons from Season As you heard in the video, Season doesn’t like looking at a single metric, or a metric at a single point in time, because it can be misleading. With that in mind, net hires means less if you don’t understand your termination metrics and recruitment rate. Remember to think of all the contributing factors and explore the data at your disposal to create a comprehensive story that creates value for your organization. The power of segmenting headcount In addition to looking at supporting metrics, you should also be segmenting your headcount to see if there are trends across departments or geography. Only looking at things as a whole may be misleading. That’s why using a tool like One Model with flexible storyboards is vital to put all the pieces of the same story on the same page.
Make sure that a headcount dashboard is one of the first essential dashboards you build. Ready to Learn More? Watch me build this report live. Connect today.
Read Article
Featured
8 min read
John Carter
Effective vs. Ineffective Leaders Successful teams are measured by how well they achieve their set goals and metrics. Effective leaders use the tools available to them to create and set challenging, yet achievable, goals. Effective leaders also encourage and enable their teams to reach and even exceed their goals. Ineffective leaders fail to set achievable goals; in many cases, the goals they set are too easy just as frequently as they are too hard. Ineffective leaders fail to provide support and encouragement, and also fail to recognize when adjustments need to be made. So how do we decipher which leaders are effective and which are ineffective? Well, to start, we must first understand that this is not a finger-pointing exercise. This is a tool to uncover both effective and ineffective strategies that can help your management teams perform better. Great management is essential for any business to succeed, and manager effectiveness metrics can be used as a tool to measure success in developing top talent for your organization. These metrics are also crucial in identifying underachieving teams and ineffective leaders who can be educated and uplifted. By tracking the manager performance metrics below, you can identify great leaders and uncover leadership techniques that can be shared. Why are manager performance metrics important? Organizations need managers who can create a productive working environment, communicate effectively with their team, make well-informed decisions quickly and consistently, and lead their team toward the company’s objectives. By tracking performance metrics in people analytics software, companies can discover effective management techniques and reward their top performers. Not only that, they also gain a solid benchmark for checking whether a new program led to measurable improvements. How to find your most effective managers with one visual Wait … Only ONE visual? I thought there were 8 metrics….
Stick with me, I’m not pulling a fast one on you. I’m about to cover each of these 8 metrics in detail, but first I want to make the point that these insights are more compelling when presented together. To find your best managers, for example, look at the Team Leader Scorecard example below. Using a heatmap view, we can quickly see how much more insight we gain when we arrange and visualize the metrics all together. Now, let’s dive into each metric. 1. Headcount growth rate Headcount growth rate is the percentage increase in the number of employees on a manager's team over a given period. It’s a good measure of manager effectiveness, as it can show how well they can find and retain top talent while continuing to build their team. It is also important to understand the drivers behind headcount growth, such as increased sales and production, or correcting an unbalanced workload distribution. How to calculate headcount. 2. Repeat low performers Repeat low performers are employees who consistently fail to meet set goals and expectations. Tracking this metric is important, as it can show how effective a manager is at developing their people and providing them with the skills needed to succeed in their roles. It could also be an indication of a manager setting unrealistic goals, in which case the goal-setting process may need to be re-evaluated. 3. High performer termination rate Effective leaders who view employees as assets can help their company combat resignations by encouraging high-performing employees to stay with the company for longer. By tracking high-performer termination rate, organizations can identify managers who are losing top talent and uncover the reasons behind the attrition. They can also seek to understand how effective leaders succeed in retaining their top talent and distribute this knowledge among all of their leaders. 4. Termination rate volume Termination rate volume measures the number of employees who leave an organization within a given period.
This metric can identify any patterns in a manager’s employee turnover so that organizations can take steps to address any underlying issues causing high turnover rates. 5. Promotion rate By looking at promotions actioned across your organization, you can see the managers and work units with more people than average being promoted. This indicates which managers are best at growing talent. Tracking manager promotion rates is an important metric for assessing manager effectiveness and identifying potential areas of improvement. Also, learn about succession planning. 6. Female representation Female representation is an important metric for assessing manager effectiveness as it shows how well a manager is at promoting diversity and inclusivity within their team. By tracking this metric, organizations can identify any challenges with gender imbalance and take steps to address them. 7. Salary ratio by gender Salary ratio measures the difference in salaries between men and women within an organization. This metric can be used to identify any discrepancies with wage discrimination so that organizations can take steps to address them. Tracking manager salary ratios is important for assessing manager effectiveness and creating a fair and equitable workplace. 8. Diverse retention gap Diverse retention gap is the difference in retention rates between diverse and non-diverse employees. This diversity metric can be used to identify any issues with manager diversity and inclusivity so that organizations can take steps to address them. It took Phil a staggering 51 seconds to pull the chart above together. Take the People Analytics Challenge and see how long it takes you to answer 90 of the top people analytics questions. What about employee retention rate? Employee retention rate measures the percentage of employees who remain with the company after being hired by a manager. 
This metric is important for assessing manager effectiveness because it shows whether or not they are creating an environment that encourages long-term employment and job satisfaction. This is also a great metric to review and there are lots of ways to talk about it. Reference these articles for more information: Calculate the cost of turnover How recruiters impact employee outcomes How to learn from new hire failure Measuring manager success to stay on top of the game These manager effectiveness metrics represent the key performance indicators (KPIs) of manager success, and if you track them with people analytics software, then you can find the most effective leaders in your company. Keeping an eye on these measures is a great way to ensure that you are developing, retaining, and promoting the highest-performing talent in your organization. With this knowledge in hand, you can make sure that your company is always staffed with the best and brightest leaders possible. Good management in business, or should I say good team management, is key in today’s competitive talent market and these KPIs provide essential data points to measure success so that your team can stay on top of the game. Want to Learn More? Let's Connect. Fill out the form to schedule a call and demo.
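As one illustration of the metrics above, headcount growth rate can be sketched from start- and end-of-period team sizes. The manager names and numbers here are hypothetical:

```python
def headcount_growth_rate(start_headcount, end_headcount):
    """Percentage change in a manager's team headcount over a period."""
    if start_headcount == 0:
        return None  # growth is undefined for a team that started empty
    return (end_headcount - start_headcount) / start_headcount * 100

# Hypothetical team sizes at the start and end of the year:
teams = {"Manager A": (8, 10), "Manager B": (12, 9)}
growth = {m: headcount_growth_rate(s, e) for m, (s, e) in teams.items()}
# Manager A grew 25%; Manager B shrank 25%.
```

Lining these numbers up alongside the other seven metrics, as in the scorecard heatmap, is what turns a single rate into an effectiveness signal.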
Read Article
Featured
2 min read
Phil Schrader
It's an all too common scenario: a rush request comes in from the leadership team, in this case leadership in the finance department. They want to understand some cost impacts related to our workforce and how it's changing. But it's Friday afternoon, so I want to get this request done as quickly and accurately as I can. Do you want to be able to knock out highly accurate, ad-hoc reports fast? Get a demo of One Model to see how!
Read Article
Featured
7 min read
Chelsea Schott
Retention rate and turnover rate are two distinct metrics that measure employee longevity in an organization. Understanding employee retention rate is essential for businesses to succeed. How to Calculate Employee Retention Rate Retention rate refers to the percentage of the workforce who remain with the organization for a specific period of time. It measures employee stability and shows the effectiveness of a company's efforts to keep employees engaged and satisfied with their jobs. Healthy retention rates mean greater expertise, increased productivity, and overall success for any organization. Using data from previous trends, or even running an exploratory data analysis, can give insight into solutions that make everybody happier in the long run. However, before we dive into the specifics of how to calculate it, it is also important to call out that finding “a retention rate” is only the beginning. In the modern age of people analytics, we need a people data platform that allows us to break our data down to draw unique understandings and insights. What does it look like for various groups? What’s the difference between exempt and nonexempt workers? Are our recruiters impacting our new hire retention? To manage your employee retention rate effectively, it’s becoming increasingly important to truly understand what goes into that retention rate. Measuring this can help explain why employees choose to stay and identify key focus areas to keep it that way. So, which retention rate formula should you use? Retention rates can be tricky to calculate. There are multiple formulas that can be used for the metric and multiple factors to consider when calculating it. One of the most common formulas involves dividing the number of employees at the end of a period by the number of employees at the beginning of a period. A glaringly obvious problem with this formula is the fact that it doesn't take into consideration any new hires or acquisitions taking place during the time period.
A company’s retention rate could easily exceed 100% if there were more hires than terminations during the period, which wouldn’t be an accurate indicator of true employee retention. And even if you exclude hires from the calculation, the retention rate becomes better holistically, but not necessarily more accurate at a granular level once you consider employees’ internal movements within the company. There are also other retention rate formulas to consider: One Model has dimensions we've created to look back and identify employees who stayed or left the company after a given number of months (typically 6 or 12) from a specific time. We can use this dimension, called Is Future Terminated, along with a headcount metric to calculate the retention rate for a historical time period. The Is Future Terminated dimension is not only helpful for calculating a retention rate, but is also used in One Model’s One AI recipe to determine the likelihood of attrition for groups of employees. Read more about One AI here! Another popular retention rate formula focuses solely on a company’s new hires. Like the calculation used for One AI, the new hire retention rate determines how many of the newly hired employees stayed with the company after a certain amount of time. As with the others, this calculation works well for looking at new hires across the company overall, but it can become complex if you are trying to determine new hire retention rate for certain departments or positions. For example, if an employee is hired in one department and transfers to a new department after six months, but terminates only one month after being in the new department, should the retention rate metric consider the new hire’s attributes – like department – at the time of hire or at the time of termination? Check out Josh Lemoine's blog, Learning from Failure: Why Measuring New Hire Failure Rate is Great!, to see even more reasons why new hire calculations are important.
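The pitfall described above is easy to demonstrate. This sketch (hypothetical numbers, not One Model's formulas) compares the naive end-over-start calculation with a version that only follows the population you started with:

```python
def naive_retention_rate(start_headcount, end_headcount):
    """End-of-period headcount over start-of-period headcount.
    Can exceed 100% when hires outpace terminations."""
    return end_headcount / start_headcount * 100

def retention_rate_excluding_hires(start_headcount, leavers_from_start_group):
    """Share of the starting population still employed at period end."""
    return (start_headcount - leavers_from_start_group) / start_headcount * 100

# 100 employees at the start; 10 of them leave; 30 new hires join.
start, leavers, hires = 100, 10, 30
end = start - leavers + hires  # 120 at period end

naive = naive_retention_rate(start, end)                   # over 100%!
adjusted = retention_rate_excluding_hires(start, leavers)  # 90% of starters stayed
```

The naive formula reports growth, not retention; the adjusted one answers the question actually being asked, though as the text notes, it still says nothing about internal movements.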
The best news is that at One Model, we understand different companies and industries may have different metrics and measures that best represent them. We will work with you on this to determine the best method for your organization, helping you build the metrics that tell your story. One Model makes it easy and Phil shows us how: People Analytics Uncovers Factors Affecting Employee Stability What Can You Do About It? Once you have calculated your retention rate, you then need to determine the factors affecting your employee stability — which is where people analytics comes into play. People analytics generates business benefits and allows you to collect and analyze data on employee behavior and attitudes in order to enhance employee satisfaction. For example, if the data shows that certain departments have a significantly higher retention rate, you can look for common factors, such as the level of managerial support or training the employees receive, that may be contributing to this trend. Another strategy that can improve retention is to use the data to develop targeted workforce engagement programs. People analytics can provide information on what types of benefits, training opportunities, or other incentives are most likely to engage employees and improve retention. By using this information to create targeted engagement programs, you can make the most of your resources and increase your chances of success. You can also compare differences between those who stay versus those who leave and see if there are certain programs/paths that are working better. Employee Retention Rate in the HR Revolution The HR revolution is transforming the way businesses approach employee retention. Companies are investing in new technology and data-driven approaches to help them better understand and engage with their employees. This shift is enabling organizations to develop more effective retention strategies that are tailored to the unique needs of their workforce. 
By tracking and improving your employee retention rate, you can reduce the costs of turnover, enhance employee satisfaction, and build a more stable, productive, and knowledgeable workforce. Embracing the employee voice and utilizing people analytics software will also help you take a more data-driven approach to retaining workers and staying ahead of the competition. Let Us Show You How to Build Retention Dashboards Today! Request a Demo
Read Article
Featured
2 min read
Dennis Behrman
Most large employers are already required by law to ensure that workers are safe and workplace risks are minimised as much as possible. But a new school of thought has emerged around the concept of well-being at work. Whether you've followed this trend closely or this is the first you're hearing of it, well-being has been studied and there is interesting data available about it. We Asked an Expert about Implementing Well-being at Work My colleague Richard Rosenow recently invited his good friend Matt Diabes, a Ph.D candidate at the Carnegie Mellon Tepper School of Business to discuss well-being in incredible new detail. His research demonstrates that well-being is far more complex than ping-pong tables and good pay. Watch their lively and informative discussion to understand what well-being is and how managers and organisations can harness the promise of well-being for great talent outcomes. If your organisation has thousands and thousands of workers whose well-being matters to you, you'll want to be able to measure well-being at your company. Find out how One Model can help you report on well-being and how to achieve organisational well-being goals. Request a Personal Demo to See How Well-being is Measured.
Read Article
Featured
11 min read
Josh Lemoine
Measuring New Hire Failure Rate in an actionable way and acting on the data will save your company money. In this blog, I'm taking a look at how your organisation can save significant sums of money and minimise workforce continuity risks by measuring and understanding your new hire failure rate. Since recruiting and onboarding new employees is expensive, retaining new employees past their earliest phase of employment is critical. When you reduce new employee turnover, you save money. A powerful tool for enabling this change in your organization is measuring New Hire Failure Rate. What is New Hire Failure Rate? New Hire Failure Rate is the percentage of a group of hires that leave the company within a set period of time. More specifically, it's people hired during a specified time period who leave the company within a certain number of months, divided by all of the hires from that specified time period. The time to termination is a lever that can be adjusted but generally ranges from 90 days to 2 years. It's a powerful measure because it spans recruiting, onboarding, and employment. A lot of data is captured during each of these phases, leaving you with a large number of factors available to analyze. Measures similar to New Hire Failure Rate include New Hire Retention Rate and New Hire Turnover Rate. Either one could be substituted for New Hire Failure Rate with a similar value proposition. New Hire Retention Rate is the same thing but inverted, and it has a more positive name 🙂. It puts the focus on those who stay rather than those who leave. The New Hire Turnover Rate calculation is a bit easier to perform, but the measure can be more difficult to interpret because it is based on headcount rather than hires. Why is it costly? New hire failure is almost universally a negative thing. Even if you're losing hires who are not a good fit for your company, it's costly.
Situations like seasonal holiday hiring at a retailer might be an exception in some cases but can be excluded from your analysis if necessary. Some specific reasons that losing employees early in their tenure is costly include the following: The rate is surprisingly high at many, if not most, companies. A quick internet search yields numbers in the 20% to 80% range. This article isn't going to cite specific numbers since plenty of other articles already do that and your company is unique. If you were informed, though, that half of your new hires leave in the first year, would you believe it? If I were a leader in the Talent Acquisition or Human Resources areas, I would certainly want to know the rate at my company. Hiring and onboarding cost a lot of money, and new hire failure increases how much of both needs to happen. Monetary costs include Talent Acquisition employee salaries, paid sources, training resources, and the time hiring managers spend interviewing and onboarding people. Companies get little productivity from employees who are not yet up to speed, and employees leaving early in their tenure are leaving before they're productive. People leaving teams is bad for the morale of those teams, and people in senior leadership leaving can be bad for the morale of the entire company. Brand reputation can suffer. Why don't all companies measure New Hire Failure Rate? You'd be hard-pressed to think of a People Analytics metric that's more powerful and actionable than New Hire Failure Rate. So why isn't it usually a key performance indicator for Human Resources and Talent Acquisition teams? Calculating New Hire Failure Rate is surprisingly tricky Hires from a specified time period that terminated within a certain number of months, divided by all of the hires from that specified time period, sounds easy enough. But you have to ensure that both the numerator and denominator come from the same group of hires.
So you need to know the hire date but also the termination date at the same time. And you need the differences between those dates bucketed so that you can adjust the "Time to Termination" between 3 months, 6 months, a year, etc. to find the sweet spot. You also have to offset the group of hires back from the current date to allow enough time to know whether the hire terminated or not. By this, I mean that if you're looking at New Hire Failure Rate within 6 months, you don't want to include hires from the past 6 months since you don't yet know whether they'll terminate within 6 months. New Hire Failure Rate Example: My colleague Phil Schrader, One Model's Solutions Architect, performed this new hire failure rate analysis from scratch in less than 5 minutes. Could you do that with your existing HR analytics today? Take the People Analytics Challenge! The measure itself isn't actionable unless you know other things about the hire. Knowing that your company has a high New Hire Failure Rate highlights that a problem exists but does not help you solve it. In order to improve retention, you need to know as much as possible about the hires who are leaving (and the ones that are staying, for that matter). Luckily, companies leveraging modern applicant tracking, onboarding, and HRIS systems have a lot of useful data available. Unluckily, this data is often not available in a useful way. To improve your New Hire Failure Rate, you need to be able to slice it every which way to find the attributes and areas to focus on. Unfortunately... The hiring process spans two separate teams and often two or more separate systems. The Talent Acquisition and Human Resources functions both involve hiring, but in most companies, they're two separate teams. Not only that, but they often leverage two separate systems (ATS and HRIS) to manage their processes.
Even companies that use one system such as Workday to manage both Recruiting and HR suffer from the data from the two functions not being cleanly linked together for analysis. On top of this, there's often data related to onboarding, such as survey data. This is extremely valuable data when tied to outcomes like early tenure terminations. Unfortunately, many companies use a survey vendor separate from their ATS and HRIS vendors, and obtaining survey results comes with its own set of challenges. How can companies measure it in an actionable way and save money? The first thing you need is a People Analytics team. A People Analytics team services both the Talent Acquisition and Human Resources functions. Since New Hire Failure Rate spans both teams, it's best to have a neutral third party reporting it. This should help prevent false assumptions that the causes of high rates stem from the other team. There's also the word "Analytics" in "People Analytics", and some analytical prowess will be useful in tracking down the causes. Tracking New Hire Failure Rate is only valuable to a company if they act on the findings. The function of a People Analytics team is to provide actionable insights, so they're well-positioned to maximize the impact of the measure. A People Analytics team needs the right tools in order to be successful. The best tool to measure New Hire Failure Rate is a People Analytics platform.
A People Analytics platform provides:
- All of the data in one place and joined together in one data model (subliminal hint)
- Core HR data such as Business Unit, Job Level, Location, and Manager
- Recruiting data such as Application Source, Time to Hire, and Recruiter
- Candidate Survey results
- Onboarding Survey results
- A complex yet intuitive way to deal with time
- All of the attributes structured into dimensions for grouping and filtering the data
- A compelling visualization layer for distributing the insights to the people who can act on them
Watch my colleague Phil Schrader perform a similar analysis in One Model. At this point, it should be clear that performing a one-off analysis of the drivers of New Hire Failure Rate would be very difficult. How can companies achieve even more success? Saving your company money was mentioned in the introduction to this article. In this article, Phil describes how you can leverage One Model to calculate source costs and cost per hire. If you know how much it costs to hire someone, you know how much money you're losing when they leave the company right away. Being able to go to leadership with dollar figures, even if they're estimates, can be a very powerful driver of change in your organization. Last but certainly not least, companies can maximize success in measuring New Hire Failure Rate by leveraging Machine Learning. This is a great use case for a causal analysis highlighting drivers of new hire failure. An advantage of performing this type of analysis using machine learning is that it's far more efficient than doing it manually. A tool like One Model's One AI is able to take all of the attributes from all of the data sources described in this article and run them through a classification algorithm, returning the attributes most predictive of both new hire failure and retention. It can do this in an intuitive way that doesn't require Data Science skills.
If that sounds too tricky, embedded insights in One Model powered by One AI can deliver various onboarding retention statistics right within storyboards. Most things that save you money in the long run require some up-front investment. Measuring New Hire Failure Rate is no exception. Just as installing solar panels saves you more in the long run than installing water barrels, leveraging a People Analytics team and platform to measure New Hire Failure Rate will be much more impactful than a one-off analysis. This is an opportunity to achieve quantifiable results and further cement the value proposition of People Analytics teams. The answers are closer than you think. Let us show you. Request a Demo
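For readers who want to see the cohort logic spelled out, here is a minimal Python sketch. The hire records, the 6-month window, and the whole-month date arithmetic are all illustrative assumptions for the example, not One Model's implementation:

```python
from datetime import date

# Hypothetical hire records: (hire_date, termination_date or None).
hires = [
    (date(2023, 1, 10), date(2023, 4, 2)),   # left within 6 months
    (date(2023, 2, 15), None),               # still employed
    (date(2023, 3, 1),  date(2024, 1, 20)),  # left after ~10 months
    (date(2023, 5, 9),  date(2023, 8, 30)),  # left within 6 months
    (date(2024, 2, 1),  None),               # too recent: excluded from cohort
]

def months_between(start, end):
    """Whole months elapsed from start to end (day-of-month ignored)."""
    return (end.year - start.year) * 12 + (end.month - start.month)

def new_hire_failure_rate(hires, window_months, as_of):
    """Share of a hire cohort that terminated within `window_months`.

    Hires made less than `window_months` before `as_of` are excluded,
    because we can't yet know whether they will fail.
    """
    cohort = [
        (h, t) for h, t in hires
        if months_between(h, as_of) >= window_months
    ]
    if not cohort:
        return None
    failures = sum(
        1 for h, t in cohort
        if t is not None and months_between(h, t) < window_months
    )
    return failures / len(cohort)

rate = new_hire_failure_rate(hires, window_months=6, as_of=date(2024, 6, 30))
print(f"6-month New Hire Failure Rate: {rate:.0%}")
```

Note how the offset excludes the most recent hire: with a 6-month window, anyone hired in the last 6 months can't yet be classified as a failure or a success, so both the numerator and denominator come from the same fully observed cohort.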
Read Article
Featured
8 min read
Phil Schrader
Turnover is the strongest signal you get from your workforce. Someone worked here, and — for one reason or another — it didn't work out. Voluntary termination of employment is a major event, and you need to pay attention to the reasoning (and the data) to help you with employee retention. While some degree of turnover is inevitable, the high cost of losing an employee can have a major impact on your bottom line. So, how do you calculate voluntary termination, and what can you do to combat it? Let's take a closer look. How to Calculate the Cost of Turnover There are a number of ways to calculate the cost of turnover. The most common (but less accurate) method is to multiply the average salary of the position by the number of separations. For example, if you have 10 employees who make an average salary of $50,000 per year and five resign, the turnover cost would be $250,000 (5 x $50,000). However, this method fails to take into account the time it takes to find and train replacement employees. A more accurate way to calculate the cost of turnover is to use a formula that factors in recruiting costs, training costs, and lost productivity. Using this formula (say, $7,500 per departure once recruiting, training, and lost productivity are tallied), the cost of turnover for our example above would be closer to $37,500 (5 x $7,500). The ability to calculate voluntary attrition internally will bring a new dimension to your leadership team.
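To make that fuller formula concrete, here is a rough Python sketch. Every cost figure below is an illustrative assumption; plug in your own recruiting, training, and vacancy numbers:

```python
def turnover_cost(separations, avg_salary, recruiting_cost, training_cost,
                  vacancy_days, workdays_per_year=250):
    """Estimate turnover cost as per-departure hiring costs plus the
    productivity lost while each position sits vacant (approximated
    here by the daily salary rate)."""
    daily_rate = avg_salary / workdays_per_year
    per_departure = recruiting_cost + training_cost + vacancy_days * daily_rate
    return separations * per_departure

# Five departures from a team averaging $50,000:
# $3,000 recruiting + $1,500 training + 15 vacant days at ~$200/day
# comes to $7,500 per departure, or $37,500 in total.
cost = turnover_cost(separations=5, avg_salary=50_000,
                     recruiting_cost=3_000, training_cost=1_500,
                     vacancy_days=15)
print(f"Estimated turnover cost: ${cost:,.0f}")
```

Lost productivity is the hardest term to pin down; many teams substitute a multiple of salary for the vacancy-day estimate used here.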
However, these benchmarks serve as a baseline for your turnover calculator, as there are several other variables and data points to consider, including:
- Daily rate of the hiring manager's salary
- Estimated hours spent interviewing and screening resumes
- Estimated cost of advertising for the available position
- Daily rate of the departed employee's salary plus benefits
- Number of days the position will remain open before you rehire
- Cost to conduct a background check
- Daily rate of the hiring manager's or trainer's annual salary
- Total days the hiring manager or trainer will spend with the new employee
- Number of working days in the new hire's onboarding period
It's also important to factor in position levels. The Center for American Progress (CAP) found that the cost of staff turnover was, on average, 213% of the annual salary for highly skilled employees. Segment the positions into three different salary levels for a more accurate turnover calculator: average entry-level, average mid-level, and average technical salaries. Tracking Voluntary Attrition Over Time The simple truth is that you will not get a full picture of what is happening from a single calculation. Calculating it manually on a consistent basis over time is time-consuming, and you need to see the trends. Create a storyboard to see if trends emerge. You can break down the involuntary and voluntary attrition rate by business unit, location, and organization tenure groupings. You can also quickly see at a glance how turnover rates are changing, where they are high, and whether it's you or the employee forcing the change. It took me 49 minutes to pull this cost of turnover visual together from scratch. How long does it take you to answer question #59? See how quickly you can take the People Analytics Challenge and answer over 90 of the top questions asked of People Analytics Teams. You Know How to Calculate Churn Rate of Employees and Turnover Cost — Now What?
Now that we know how to calculate the cost of turnover, let's look at how you can mitigate voluntary attrition in your organization. Use One AI Recipes to Predict Trends and Make Proactive Changes You know the "What", but you really need to know the "Why". So, run a predictive model on that data to pull out the correlations and understand the why behind the attrition. One AI Recipes will help you predict the likelihood of a person in a selected population voluntarily terminating within a specified period of time. To do this, One AI will consider a number of attributes and will train the model on the population at a defined point in the past. Does distance to the office, time since last promotion, or paid sick leave correlate to a rise in attrition? AI is a tool that helps you make connections and better understand voluntary resignation reasons in order to take specific actions leading to improvement. Improve Hiring Practices Poor hiring practices could easily be one of the reasons why your voluntary attrition rate is high. Be sure to clearly define the skills and experience required for each position and only interview candidates who meet those criteria. Conduct thorough reference checks, and don't hesitate to pass on a candidate if there are any red flags. After analyzing your hiring process, you can incorporate the proper people analytics data to produce the most accurate cost of losing an employee. Promote from Within Another great way to reduce turnover is to promote from within whenever possible. Not only does this show your employees that there are opportunities for advancement within your company, but it also helps reduce training costs because you already have someone on staff who knows your company culture and how things work. Invest in Employee Development Finally, investing in employee development is a great way to reduce turnover rates. Employees who feel like they are learning and growing in their roles are more likely to stick around.
Offer professional development opportunities through tuition reimbursement programs or paid memberships to professional organizations. Final Thoughts on How to Calculate the Cost of Turnover Sure we want to understand how much it costs the company. That is the first step in getting leadership to care. However, the real work begins when you understand why people are leaving and can build a plan to curb the costs. People analytics offers real-time labor market intelligence to help businesses identify pain points causing turnover. And considering the high cost of losing an employee and its impact on your bottom line, employee retention is critical in today’s economy. One AI Recipes make creating a predictive model from your people data as easy as choosing the outcome you want to predict and answering a series of questions about the data you want to leverage to make the predictions. The result is a predictive model based on robust, clean, and adequately structured data — all without engaging a data engineer or data scientist. Calculating turnover is the first step toward helping you understand and predict trends, reduce turnover rates, and keep your business running smoothly. Watch Me Build A Turnover Analysis Live Request a Personal Demo Today.
Read Article
Featured
7 min read
Chris Butler
The employee survey is still perhaps the most ubiquitous tool HR has to give employees a voice. It may be changing and being disrupted (debatable) by regular or real-time continuous listening and other feedback mechanisms. Regardless, employee survey data collection will continue. I am, however, constantly amazed by the amount of power that is overlooked in these surveys. We're gathering some incredibly powerful and telling data, yet we barely use a portion of the informational wealth it holds. Why? Most organizations don't know how to leverage confidential employee survey results correctly while maintaining the privacy provisions they agreed to with their employees during data collection. The Iceberg: The Employee Survey Analytics You're Missing Specifically, you are missing out on connecting employee survey answers to post-survey behaviours. Did the people who said they were going to leave actually leave? Did the people who answered that they lack opportunity for training actually take a training course when offered? Did a person who saw a lack of advancement opportunities leave the company for a promotion? How do employee rewards affect subsequent engagement scores? And of course, there are hundreds of examples that could be thrown out there; it is an almost limitless source of questioning. You don't get this level of analysis ROI from any other data source. Anonymous vs. Confidential Surveys First, let me bring anyone who isn't familiar with the difference up to speed. An anonymous survey is one where all data is collected without any identifiers at all on the data. It is impossible to link back to a person. There's very little you can do with this data apart from what is collected at the time of questioning. A confidential survey, on the other hand, is collected with an employee identifier associated with the results.
This doesn't mean that the survey is open; usually, the results are not directly available to anyone from the business, which provides effective anonymity. The survey vendor that collected these results, though, does have these identifiers, and in your contract with them, they have agreed to the privacy provisions requested and communicated to your employees. And a number of survey vendors will be able to take additional data from you, load it into their systems, and show a greater level of analysis than you typically get from a straight survey. This is better than nothing but still far short of amazing. Most companies, however, are not aware that survey vendors are generally happy (accepting, at least) to transfer this employee-identified data to a third party as long as all confidentiality and privacy restrictions that they, the customer, and the employees agreed to when the survey was collected are maintained. A three-way data transfer agreement can be signed where, in the case of One Model, we agree to secure access to the data and maintain confidentiality from the customer organization. Usually, this confidentiality provision means we need to: Restrict the data source from direct access. In our case, it resides in a separate database schema that is inaccessible by even a customer that has direct access to our data warehouse. Provide 'Restricted' metrics that provide an aggregate-only view of the data, i.e. only show data where there are more than 5 responses or more than 5 employees in a data set. The definition of how this is restricted needs to be flexible to account for different types of surveys. Manage Restricted metrics as a vendor, preventing them from being created or edited by the company when a restricted data set is in use.
Support employee survey dimensionality that adheres to this restriction, so you can't inadvertently expose data by slicing a non-restricted metric by a survey dimension and several other dimensions to create a cut of a population that otherwise may be identifiable. Get Ready to Level Up Employee Survey Analysis! Your employee survey analytics can begin once your survey data is connected to every other data point you hold about your employees. For many of our customers, that means dozens of people data sources across the recruit-to-retire and business data spectrums. Want to know what the people who left the organization said in their last survey? Three clicks and a few seconds later, and you have the results. Want to know if the people you are recruiting are fitting in culturally, and which source of hire they were recruited from? Or if low-tenure terminations show any particular trends in engagement or culture responses? Or whether people who were previously highly engaged and have a subsequent drop in engagement have a lack of (choose your own adventure) advancement|compensation|training|skilled-peers|respect for management? Literally, you could build these questions and analysis points for days. This is what I mean: a whole new world opens up with a simple connection of a data set that almost every company has. What can I do? Go and check your last employee survey results and any vendor/employee agreements for how the data was to be collected and used. If the vendor doesn't state how it's being collected, check with them; often they are collecting an employee identifier (id, email, etc). If you are lucky, you might have enough leeway to designate a person or two within your company to be able to run analysis directly. Otherwise, enquire about a data transfer agreement with a third party who will maintain confidentiality. I've had this conversation many times (you may need to push a little).
If you don't have data collected with an identifier, check with HR leadership on the purpose of the survey and the privacy you want to provide employees, and plan any changes for integration into the next survey. This is a massively impactful data set for your people analytics, and for the most part, it's being wasted. However, always remember to respect the privacy promise you made to employees: communicate how the data is being used and how their responses are protected from being identified. With the appropriate controls, as outlined above, you can confidentially link survey results to actual employee outcomes and take more informed action on the feedback you collected in the employee survey analysis. If you would like to take a look at how we secure and make survey data available for analysis, feel free to book a demonstration directly below. Ready to see us Merge Employee Survey Data with HRIS Data? Request a Demo!
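For illustration, the aggregate-only restriction described above (suppress any cut with fewer than five respondents) can be sketched in a few lines of Python. The data and the five-respondent threshold are assumptions for the example, not One Model's actual implementation:

```python
def restricted_metric(responses, group_key, value_key, min_group=5):
    """Average `value_key` per group, suppressing any group (cell)
    with fewer than `min_group` responses to protect anonymity."""
    groups = {}
    for row in responses:
        groups.setdefault(row[group_key], []).append(row[value_key])
    return {
        group: (sum(vals) / len(vals)) if len(vals) >= min_group else None
        for group, vals in groups.items()
    }

responses = (
    [{"dept": "Sales", "engagement": s} for s in (4, 5, 3, 4, 5, 4)]
    + [{"dept": "Legal", "engagement": s} for s in (2, 3)]  # only 2 responses
)
result = restricted_metric(responses, "dept", "engagement")
print(result)  # Legal is suppressed (None); Sales shows its average
```

Real restricted metrics also have to guard against complementary disclosure across overlapping dimension cuts, which is why the dimensionality restriction described above matters as much as the threshold itself.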
Read Article
Featured
6 min read
Jamie Strnisha
In today's competitive business landscape, it's more important than ever for workplaces to provide value to employees, customers, and investors. Attracting top talent, boosting productivity, enabling innovation, and improving employee experience are all key goals to achieving higher value. Many progressive companies are accomplishing these goals by tracking facility analytics — including attendance tracking and workplace tracking — to make facility improvements. That's right. Companies use facility analytics to improve their environments and retain top talent, and you should too. What is Facility Analytics? Facility analytics is the process of collecting data about how a workplace is being used daily. Space utilization data is one type of data that can be collected (through sensors, badge swipes, surveys, or observation studies) and integrated into your overall people analytics data lake. With this workplace analytics data, managers can make proactive, positive changes to the company's culture and work environment. This can even include transitioning away from assigned workspaces to flexible shared spaces and remote work. Since the pandemic, many companies have made changes like these and have been able to reduce property costs and optimize public use spaces. After analyzing which departments are using which spaces, changes can even be made to bring teams closer together to improve mobility, increase employee connections, and boost productivity. Why Use Facility Analytics and Workspace-Related Data? Some may be afraid that facility analytics is too intrusive, but done correctly, it can actually be a critical tool in improving the overall workplace experience. A few examples of what time and attendance people analytics can help track include: The best days for employee gatherings (monitor the days where most employees are in the office). Collaboration among employees by using survey data to track communication and teamwork trends.
Employee burnout and workload to determine which attributes are leading to turnover. Office movement to know where employees' desks are versus where they actually worked. Hoteling policy and proximity analysis so managers can see who is using the hotel desks and whether additional desks are needed. Energy-saving initiatives, such as changing the temperature or adding motion-sensor lights in unused workspaces. Office activity and meal planning. Contagious illness tracking (e.g., COVID-19) to identify and mitigate risks in the organization by knowing who is in the office on any given day. Facility data in people analytics can also be used to track employee productivity. If you find that teams are travelling long distances to meetings, you can move teams and/or encourage online meetings to reduce travel time and increase work availability. In addition, if you see that some people are consistently at the office for long periods of time, you may be able to intervene to prevent burnout. How to Use Attendance Tracking to Future-Proof Your Facility Offices provide social interaction, creativity options, and collaboration. Your goal should be to design a work environment that meets those needs and more. COVID-19 has changed the workplace as we know it. The pandemic gave many people a taste of remote work they never had. With so many employees working remotely, companies are starting to realize that the traditional 9-5 in an office setting may not be necessary. In fact, Forrester Research found that 60% of companies are now utilizing hybrid schedules where employees can work from home and in the office. This "next normal" workplace will require a new way of thinking when it comes to managing employees and facilities. Luckily, when you merge survey data, facility data, and your HRIS, you can start to understand how best to meet business objectives and employee needs by team, cohort, or distance from the office.
Ways to Capture Workplace Analytics Wi-Fi Sensors Common Workplace Analytics can be tied to People Analytics and provide a more in-depth understanding of your people. Wi-Fi sensors can be used to track employee movement and usage of common areas. This data can then be used to make adjustments to the layout of the office, as well as cleaning and sanitizing schedules. Additionally, Wi-Fi sensors can be used to send alerts to employees when they enter an area that has been recently cleaned or sanitized. Mobile Apps Mobile apps can be used for a variety of purposes, such as monitoring attendance analytics, sending notifications and alerts, and providing access control to certain areas of the office. Reservation Systems Reservation systems allow employees to book workspace in advance. Additionally, they can be used for location analysis and people analytics (e.g., tracking employee usage of common areas). Badging Data Badging data refers to the workspace-related data collected by security badges that employees wear. This data can be used for a variety of purposes, such as tracking employee movement, identifying trends, attendance tracking, and improving security protocols. Get Maximum Value From Analytics Workspace With One Model Unlock your valuable facility analytics and attendance tracking data with One Model and specialized data modeling that enables you to extract, aggregate, and analyze your data like never before. See for Yourself. Connect with Us Today. Facility analytics is a powerful tool that today's workplaces can use to improve employee experience and boost productivity. One Model seamlessly connects your facility tracking data with other third-party resources — such as a Human Resources Information System (HRIS), Integrated Workforce Management System (IWMS), and surveys about the workspace — to help you improve the workplace and stay ahead of the competition.
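As a toy illustration of the badging-data idea, here is how you might find the best gathering day from swipe logs. The records and the distinct-employee counting rule are assumptions made up for the sketch:

```python
from collections import Counter
from datetime import date

# Hypothetical badge-swipe log: (employee_id, swipe date).
swipes = [
    ("e1", date(2024, 3, 4)), ("e2", date(2024, 3, 4)),                            # Monday
    ("e1", date(2024, 3, 5)), ("e2", date(2024, 3, 5)), ("e3", date(2024, 3, 5)),  # Tuesday
    ("e3", date(2024, 3, 8)), ("e3", date(2024, 3, 8)),                            # Friday (repeat swipe)
]

WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

# Count distinct employees on site per weekday; set() de-duplicates repeat swipes.
per_day = Counter(WEEKDAYS[day.weekday()] for _, day in set(swipes))
best_day = max(per_day, key=per_day.get)
print(per_day, "->", best_day)
```

The same distinct-person-per-day aggregation works for Wi-Fi sensor or reservation-system data; only the source of the (person, day) pairs changes.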
Read Article
Featured
7 min read
Phil Schrader
People analytics provides insight into your organisation’s workforce. Your company’s workforce is at or near the top of your organisation’s expenses and strategic assets. Describing the importance of people analytics is very much an exercise in stating the obvious. For this reason, more and more companies are relying on people analytics, and that reliance is growing even as economic conditions change. In fact, as economic conditions become more challenging, people analytics becomes more, not less, important. Imagine a pilot flying in bad weather. Data on altitude, speed, location, etc become even more critical in that context. So yes, it makes sense to invest in people analytics now, even amidst our current economic concerns. People analytics in a recession is one of the most measurable strategies that HR can pursue. Whether you are hiring during a tight labor market or working through the implications of layoffs and reorganizations, you will want accurate, multi-dimensional, effective-dated, relational analytics ready to guide your decisions. People analytics doesn’t just help organise HR data. It generates faster insights from widely-dispersed HR data to make better talent decisions. For example, your people teams can better manage workforce and staffing levels, maximise productivity, and avoid guesswork about their diversity and inclusion objectives. “New and improved” HR reports alone won’t cut it. With people analytics, your analysts and managers can run exploratory data analysis to connect and understand relationships, trends, and patterns across all of their data. Additionally, the analysis adds context and meaning to the numbers and trends that you’re already seeing. The advantages of people analytics and why you should budget for it in a recession. Advantage #1 - Save money with people analytics. For nearly every business, labor is one of its most significant costs. But human capital is essential to generating revenue. 
HR analytics provides strategic and tactical visibility into one of your organisation's most vital resources - its people. When your company uses analytics to manage the right people out, it can also use analytics to help you focus your recruitment efforts. After all, replacement costs for an employee can be as high as 50% to 60% of their annual salary, with overall costs ranging from 90% to 200%. For example, if an employee makes $60,000 per year, it costs $30,000 to $36,000 just to replace that employee and about $54,000 to $120,000 in overall losses to the company. HR analytics can also become a strategic advisor to your business to show insights into how your organization is changing. For example, people analytics can track trends in overtime pay, pay rate change for various positions, and revenue per employee (to name a few). While the revenue per employee calculation is a macro number, it's important for you to be attuned to how it's changing. Knowing the trends of your revenue per employee can lead directly to asking important questions about your people strategy: Are we investing in people now for future revenue later? Are we running significantly leaner than we have in the past? Are we running too lean? If metrics like revenue per employee or overtime pay are dropping or increasing over time, it could indicate that adjustments need to be made on a departmental level. Advantage #2 - Identify trends affecting morale or productivity. People analytics can also help you identify trends within your workforce that may be negatively affecting your business. HR data can help you pinpoint what is causing the change, and then address these issues early so you can avoid potential problems down the road. For example, Cornerstone used metrics such as policy violations and involuntary terminations to identify "toxic" employees harming the company's productivity. The findings showed that hiring a toxic employee is costly for employers — to the tune of $13,000.
And this number doesn't even include long-term productivity losses due to the negative effects those toxic employees had on their colleagues. With people analytics, Cornerstone identified common behavioral characteristics of toxic employees and now uses this data to make more informed hiring decisions. This created immediate benefits for their existing employees as well as future advantages as their workforce evolved. Advantage #3 - Recruit and retain top talent. The many benefits of people analytics also include a competitive edge when it comes to recruiting and retaining top talent. By understanding the needs and wants of your employees, you can create a workplace that is more attractive to potential candidates. In a world where data is constantly being updated, it's important for talent acquisition and HR leaders to make informed decisions quickly. HR analytics gives them that power at speed (rather than waiting months before seeing what happened). Using AI to discover related qualities of your top performers can also help your acquisitions team select candidates that will fit well into your culture and start driving results. Advantage #4 - Identify high-performing departments. Another one of the advantages of HR analytics is its ability to pinpoint positive changes as well. HR leaders can track department performance to know when to reward or incentivize employees for their productivity and work ethic. Additionally, it can help you keep your employees happy and engaged, which is essential for maintaining a high level of productivity (and sales). For example, Best Buy analyzed its HR data to discover that a 0.1% increase in employee engagement resulted in more than a $100,000 increase in annual income. Further, AMC's people data showed that the theaters with top-performing managers earned $300,000 more in annual sales than the other theaters.
These HR insights also helped this Fortune 500 company identify top talent and ideal candidates for its managerial positions, which ultimately resulted in a 6.3% increase in engagement, a 43% reduction in turnover, and a 1.2% rise in profit per customer. Identify Trends With Real-Time Labor Market Intelligence Ultimately, HR analytics offers real-time labor market intelligence to help businesses identify pain points causing turnover — something that’s essential in today’s hiring climate infamously referred to as “The Great Resignation.” The rise in turnover rates is a nationwide problem. It’s important for companies to find out why their employees are leaving and then create an effective strategy so they can stop the trend before it gets worse. One Model’s people analytics software can be a valuable tool for any business, especially during a downturn. In short: You should budget for HR analytics as an investment, not a cost. If you’re worried about a recession, you can start performing complex analysis on your data in just a few weeks. Let us show you 1:1
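The replacement-cost rules of thumb cited in Advantage #1 are easy to turn into a quick back-of-the-envelope calculator; this sketch simply applies those percentage ranges to a salary:

```python
def replacement_cost_range(salary):
    """Apply the rule-of-thumb ranges: direct replacement costs of
    50-60% of salary, overall costs of 90-200% of salary."""
    return {
        "replacement": (salary * 0.50, salary * 0.60),
        "overall": (salary * 0.90, salary * 2.00),
    }

costs = replacement_cost_range(60_000)
for kind, (low, high) in costs.items():
    print(f"{kind}: ${low:,.0f} to ${high:,.0f}")
```

Even rough ranges like these, applied to your actual headcount and salaries, give leadership a dollar figure to weigh against the cost of a people analytics investment.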
Read Article
Featured
5 min read
Stacia Damron
Is your company meeting its diversity goals? More importantly, if it is, are you adequately measuring diversity and inclusion success? While we may have the best intentions, today's companies need to be focused on not just monitoring hiring metrics - but effectively analyzing them - in order to make a DE&I difference in the long term. But first, in order to do that, we need to take a look at key metrics for diversity and inclusion success. Let's talk about these diversity KPIs we're measuring and why we're measuring them. Without further ado, here are 4 out-of-the-box ways to measure diversity-related success that don't have to do with hiring - all of which can help you supplement and enhance your current reporting. Number 1: Rate and Timing of an Individual's Promotions Are non-minority groups typically promoted every year and a half while minorities are promoted every two years? Are all employees held accountable to the same expectations and metrics for success? Is your company providing a clearly-defined path to promotion opportunities, regardless of race or gender? Every hire should be rewarded for notable successes and achievements, and promoted according to a clear set of criteria. Make sure that's happening across the organization - including for minority groups. Digging into these metrics can help determine those answers and, at the very least, put you on a path to asking the right questions. Number 2: Title and Seniority Do employees with the same levels of educational background and qualifications receive equitable salaries and titles? Often, minorities are underpaid compared to their non-minority counterparts. Measuring and tracking rank and pay metrics are two good ways to spot incongruences and catch them early - giving your company a chance to correct a wage gap versus inadvertently widening it over time. Quantitative measures of diversity like this can help you see trends over time, because changing diversity is a slow process with a long turning radius.
Keep your eye on historically underpaid groups. A fairly paid employee is a happy, loyal employee. Number 3: Exposure to Upper Management and Inclusion in Special Assignments Global studies cited in a Forbes article revealed that a whopping 79 percent of people who quit their jobs cite ‘lack of appreciation’ as their reason for leaving. Do your employees – including minority groups - feel valued? Are you empowering them to make an impact? Unsurprisingly, people who feel a sense of autonomy and inclusion report higher satisfaction with their jobs – and are therefore more likely to stay. Are all groups within the organization equal-opportunity contributors? Bonus: On that note - are you performing any type of employee satisfaction survey? Number 4: Training and Education Programs and Partnerships In 2014, Google made headlines for partnering with Code School. They committed thousands of paid accounts to provide free training for select women and minorities already in tech. Does your company have a similar partnership or initiative in your community or company? As simple as it sounds – don’t just set it and forget it. Track the relevant diversity KPIs that determine success and measure the results of your programs to determine whether they are, in fact, helping you achieve your commitments towards improving diversity. The Summary: Success Comes from Measuring Diversity and Inclusion Hopefully, one or two (heck - maybe all four) of the items above resonated with you, and you’re excited to go tinker with your reporting platform. But wait - what if you have all this data, and you WANT to build some predictive models and see correlations in the data - and you’re all giddy to go do it - but you don’t have the tools in place? That’s where One Model can help. Give us your data in its messiest, most useless form, load it into our platform, and we’ll help you fully leverage that data of yours. Want to learn more? Let's Connect About Diversity Metrics Today. 
Let's get this party started. About One Model: One Model provides a data management platform and a comprehensive suite of people analytics, drawing data directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as needed. We provide a full platform for delivering more information, measurement, and accountability from your team.
Featured
9 min read
Phil Schrader
People analytics teams tend to shy away from calculating revenue per employee. It’s a very macro number. On its own, a single revenue per employee calculation tells you very little. Even large differences in revenue per employee do not necessarily mean anything or, at least, are not particularly “actionable” when immediately discovered. If I’m being honest, I feel like we people analytics folks have collectively decided we’re just a little too nuanced in our thinking to risk spending time on a data point that can be explained away almost as easily as we can calculate it. But then again, it’s pretty easy to calculate. And, while it is a very macro number, you still ought to be attuned to how it’s changing. I check the weather, not the climate, before going out camping for the weekend, but that doesn’t mean it isn’t important to know if the climate is changing. So put aside your nuanced expertise with me and let’s do some basic calculations! How to calculate revenue per employee? The most simplistic view of this metric is to use this formula: The average revenue per employee = Revenue during period / Number of employees in that period Ok, so let’s say you get a number like $200,000. So what? It’s not necessarily good or bad. There are happy shareholders out there whose companies have a lower revenue per employee and unhappy shareholders whose revenue per employee is much higher. So next you want to look at how that is trending over time, like so. (Source: One Model Storyboard Visualization for revenue per employee using test database) Trending is probably the best way to look at this data. Ideally, the number trends up, right? But it might not always. Perhaps you are growing and investing heavily in building out teams that will deliver revenue in the future. Perhaps your employee mix is changing. You’re adding call center employees instead of research scientists. Perhaps perhaps perhaps. 
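The arithmetic really is as simple as the formula above suggests. Here is a minimal sketch with made-up quarterly figures (any resemblance to a real company is accidental):

```python
# Revenue per employee: face value means little, but the trend invites questions.
quarters = [
    ("2023-Q1", 48_000_000, 240),  # (period, revenue, average headcount) -- invented
    ("2023-Q2", 50_000_000, 250),
    ("2023-Q3", 51_000_000, 262),
    ("2023-Q4", 52_000_000, 275),
]

def revenue_per_employee(revenue, headcount):
    """Average revenue per employee = revenue during period / employees in period."""
    return revenue / headcount

trend = [(period, round(revenue_per_employee(r, h))) for period, r, h in quarters]
for period, rpe in trend:
    print(period, rpe)  # drifts down even as total revenue grows
```

The individual numbers say little on their own; the downward drift across quarters, despite growing revenue, is exactly the kind of signal that should prompt questions about hiring mix and investment.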
And so, even though we might be able to explain any changes away at this point, that’s exactly the value of making the simple calculation in the first place. The resulting thought process leads directly into asking really fantastic questions about your people strategy. Are we investing in people now for future revenue later? Are we running significantly leaner than we have in the past? Maybe too lean? Pulling this data together into a simple graph took me 5 minutes and 24 seconds. Find out how long these types of insights take YOUR team to generate with the People Analytics Challenge. Variations to Consider on Revenue per Employee by Industry Thinking about direct vs. indirect contributors Ok, let’s get back to our thoughts about call center employees vs. research scientists. Both are valuable, but you would expect an organization with more of the latter to have a higher revenue per employee. So in your own analysis, you can do a couple of things next. Segment by employee roles and do the same trend analysis. Again, the face-value number might be meaningless. “We make $10,000,000 per accountant.” The trend, however, IS interesting. “Say, we seem to need a lot more accountants per dollar of revenue than we used to. I wonder why that is?” You could switch to a harder calculation like revenue per dollar of compensation. (Check out Mike West, “Estimating Human Capital ROI,” p. 102 of People Analytics for Dummies.) While you may not always be able to attribute revenue earned to specific departments, it will give you more insights if you monitor changes over time. If revenue per employee is dropping or increasing over time, it could indicate that adjustments need to be made on a departmental level. Switching to net vs gross If you do find some interesting insights as you trend out revenue per employee, you may want to check with finance on which dollar figure best reflects the company's business strategy. 
If not, presenting this information to the C-level could be a mistake. You may find it relevant to look at net income, or the amount left after all expenses. Here are the various levels of income to think about: Gross Revenue – Sales discounts = Net Revenue; Net Revenue – Cost of goods = Gross Profit; Gross Profit – Operating expenses = Operating Profit (EBIT); Operating Profit – Loan interest = Profit before taxes (PBT); PBT – Taxes = Net Income. If you’re losing money, Net Income could help you determine which areas of the business may be contributing to that more than others. Again, these revenue-per metrics are more of an early-warning system than a root-cause analysis. Use them as a scanner to home in on more specific analyses. Once you get going, you can start to cut the data in more ways to see what stands out. Grouping by Seasonality Measuring this metric by quarter or by month may give you an idea of how the business cycle evolves. However, be sure to compare similar quarters YoY (year over year) or months YoY to get a better idea of whether things are improving or getting worse over time. Considering the Role of Employee Turnover These last few years have been tough. As HR leaders, we know that bringing on new employees can directly impact the productivity of the company. If you have had a lot of turnover, you may want to align dates to see how this has impacted your revenue per employee ratio. Rather than celebrating a spike in revenue per employee, you might raise the alarm that you are stretched much thinner than before. Group by Tenure If new employees cost more, then your most tenured employees must deliver the biggest bang for your buck, right? Breaking your data out by various cohorts will give you a better idea of how effectively you're building your talent. Do you want to see how One Model builds the insights you need in under 6 minutes? Request a Demo Today! How to Find a Revenue per Employee Benchmark for Your Industry Another upshot of this relatively simplistic metric: 
You can often estimate it for publicly traded companies. The easiest way to do this is to choose at least 3 to 6 public companies in your space. Because they are publicly traded, finding that information on the investor relations section of their site is much easier than trying to collect that same intel on private companies. In most cases, you can find that information over time as well. Once you have that, the estimated number of employees can be found on many database websites, or even on LinkedIn. That said, revenue per employee by industry for private companies can require a substantial amount of work. Look for investor information, and search for news articles on earnings. You Found Your Average Revenue per Employee and Benchmark - Now what? For starters, don’t panic (or celebrate) if you notice big differences between your organization and others. A competitor may use a much larger number of contingent workers who don’t appear in their employee counts, or have a very different support model. Again, this might feel like it devalues the comparison, because so much just depends. That said, doing the calculation suddenly gets you thinking things like, “Hey, has their people strategy changed significantly?” “Do they use more/less contingent staff?” These are great questions! How Can You Change Average Revenue per Employee? Now that you’re in a strategic mindset, start thinking about the levers you have at your disposal in order to adjust these numbers: Add additional hires, focusing on their ability to contribute to your top and bottom lines. [Fun Game] Make a bet with yourself about how it will impact revenue per employee in the short and long term. See if your performance and engagement data trends align with the departmental trends you’re seeing. Are high-performing teams trending up in revenue per employee? If so, how might you quantify the value and find new ways to invest in these areas? Create some buzz around the metric. 
Tell people that, of course, you know it’s a macro data point, but get others thinking about how it’s changing. Take a break from fighting over how to measure promotion rates and enjoy the landscape view. If paired with a high turnover rate, work to retain high-performing employees. Look at tools that may increase the efficiency of your workforce. Develop management to maximize performance in each department.
Featured
7 min read
Nicholas Garbis
How do we measure the value of people analytics? Is your organization making better, more data-informed talent decisions today versus a year ago? This is the ultimate test of any people analytics (PA) program, initiative, team, COE, or department. If the answer is yes, the investments in PA continue and expand. If the answer is no, then PA budgets are questioned. So how can we demonstrate the value of people analytics? In our latest whitepaper, "Measuring the Value of People Analytics," we address this from the ground up, starting with the mission of people analytics and moving into the utilization of the content delivered by the PA team. With a more comprehensive view of how PA creates value, you will be better positioned to build your business case for people analytics. Whether you are seeking initial, incremental, or transformational levels of investment, this value framework will help you convince your organization to become fully invested in HR analytics. Tackling the ROI Conundrum The proposed ROI calculations that many vendors recommend for people analytics are not very good -- and some are downright laughable. This is one of the reasons I worked on this paper. Two common approaches: Estimated savings through efficiencies of system consolidation or process acceleration. Estimated reduction in attrition, faster time-to-fill of job postings, or improvements to other KPIs. The promise that PA technology will reduce turnover, and the practice of putting a financial value on that promise ... then hiding when the 'Great Resignation' starts, or saying 'it would have been worse!' Ugh! This is not honest or helpful. We can do better in declaring the value we propose to generate. This is one of the key points in the paper. This blog highlights some of the key elements of the whitepaper. You will definitely want to read the whole thing. Click here to get the full version. 
We Need to Address the Big “Why?” Why are we investing in people analytics? Is the deliverable we are committing to - the “return” on the investment - as simple as a bit of system and process savings and some hypothetical lift to a couple of KPIs? Mission of People Analytics: Drive better, faster talent decisions at all levels of the organization. We are investing resources in people analytics to drive and accelerate this mission. The value of people analytics should be judged by the quality of talent decisions that are being made across the organization. We may not be able to measure the quality of talent decisions directly (though we will address that in an upcoming paper), but we can use utilization as a proxy to get started. If our PA deliverables are being utilized, we can logically assume that the users are placing value on them. They are 'voting' for the content. If it was not valuable, they would ignore it. In the paper, we demonstrate how utilization can be used to calculate value with relative ease across your PA portfolio. Value Journey for People Analytics Looking at each 'analytics event' through a process sequence, a "value journey," we can see how critical PA content is in delivering value at scale. To impact talent decisions at all levels of the organization, we need to build a smooth and fast self-service cycle (left side) by focusing on: creating an analytics mindset and culture, applying user-centered product design, communicating effectively, and applying sound change management. "We have data that can help here." The diagram below shows the target picture, where a user, encountering the talent elements of a business challenge, thinks "We have data that can help here." This is the critical first step that ideally flows them into a set of high-quality PA products that can deliver the needed insights. 
Any business challenge can be divided into talent elements (staffing, skills, productivity, etc.) and non-talent elements (market forces, supplier issues, etc.). People analytics provides value through products and services that support understanding and solving the talent elements of the challenge. Impacting talent decisions at scale requires PA teams to deliver insight-generating self-service solutions. So now that we’ve covered that, how do we measure the value of people analytics at your company? Is there a formula we can use to make our PA investments more intentional? If so, how can we determine where we should focus our efforts, and what content or communications efforts are necessary to deliver the outcomes we expect? Another core assumption in people analytics is that your leaders’ time is a scarce and valuable resource, and we will use that assumption to anchor our value measurement approach. We assume that your organization’s leaders: Are selective about what they spend their time on. Choose to spend their time on things they consider valuable. See value in content if they engage with it regularly. Will rely on content that continues to inform better talent decisions over time. Download the paper to see the way we have calculated the value of a small PA portfolio based on the value-utilization framework. Further work is needed to articulate how to measure the change in talent decision quality more directly. We will be tackling that in future content -- so keep an eye out for it! Get the equation in our Measuring the Value of People Analytics Whitepaper Ready to see how upgrading your people analytics solution will improve the value your team is bringing to the business?
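The utilization-as-proxy idea can be made concrete. The sketch below is a hedged illustration, not the whitepaper's actual equation: it simply prices leaders' time-on-content at an assumed loaded hourly rate, treating that spend as a floor on perceived value. All asset names, usage figures, and the rate are invented:

```python
# Illustrative only: value floor per PA asset = views * minutes-per-view,
# converted to hours and priced at an assumed leader hourly rate.
portfolio = [
    # (asset, monthly_views, avg_minutes_per_view) -- invented numbers
    ("Headcount Storyboard", 420, 6),
    ("Attrition Dashboard", 180, 9),
    ("DEI Scorecard", 90, 12),
]
LOADED_HOURLY_RATE = 150  # assumed dollar value of one leader-hour

def utilization_value(views, minutes, rate=LOADED_HOURLY_RATE):
    """Monthly value floor implied by time leaders choose to spend on an asset."""
    return views * minutes / 60 * rate

for asset, views, minutes in portfolio:
    print(asset, round(utilization_value(views, minutes)))
```

The point of a calculation like this is not precision; it is that unused content scores zero, which forces an honest conversation about which deliverables are actually informing decisions.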
Featured
17 min read
Chris Butler
Workday vs SuccessFactors vs Oracle Ratings Based on Experience Integrating HR Tech for People Analytics This vendor-by-vendor comparison will be a living post, and we will continue to update it as we have time to collect thoughts on each vendor and as we complete integrations with new vendors. Not every source we work with will be listed here, but we'll cover the major ones that we often work with. At One Model we get to see the data and structure from a load of HR systems and beyond; basically anything that holds employee or person data is fair game as a core system to integrate for workforce analytics. After more than a decade of HR analytics integration architecture experience, where the solution is directly integrating data from these systems into analytics and reporting solutions, we have a lot of experience to share. Below I'll share our experience with highlights from each system and how they align with creating a people analytics warehouse. Some are better than others from a data perspective, and there are certainly some vendors that are yet to understand that access to data is already a core requirement of buyers looking at any new technology. Bookmark this blog, add your email to the subscription email list to the right, or follow me (Chris Butler) and One Model on LinkedIn to stay up to date. A Quick Note on HRIS Platform Ratings Ratings are provided as an anecdotal and unscientific evaluation of our experience in gaining access to, maintaining, and working with the data held in the associated systems. They are my opinions. If you would like to make use of any of our integrations in a stand-alone capacity, we now offer a data warehouse only product where you utilize just our data pipeline and modelling engine to extract and transform data into a data warehouse hosted by One Model or your own data warehouse. 
We'll be releasing some more public details soon, but if you are a company that likes to roll your own analytics and visualizations and just needs some help with the data side of the house, we can certainly help. Contact Us Cloud HRIS Comparison Workday One Model rating - 2.5/5 Method - API for standard objects, built-in reporting for custom objects (via reporting-as-a-service, or "RaaS") The Good - Great documentation, Easy to enable API access and control of accessible fields, and Good data structures once you have access. The RaaS option does a good job but is limited. The Bad - Slow; Slow; Slow; No custom fields available in API, Geared towards providing a snapshot, number of parallel connections limited, constant tweaking required as new behaviors are identified, Expert integration skills required; True incremental feeds require you to read and interpret a transaction log Workday Requires a Custom-Built People Analytics Integration Architecture Workday analytics embedded into the product is underwhelming, and we're yet to see Prism Analytics make a dent in filling the needs that people analytics teams or HR analysts have beyond convenience analytics. So in the meantime, if you are serious about improving reporting and people analytics for Workday, you're going to need to get the data out of there and into somewhere else. On the surface, Workday looks to have a great API, and the documentation available is excellent. However, the single biggest downfall is that the API is focused on providing a snapshot, which is fine for simple list reports but does not allow a people analytics team to deliver any worthwhile historical analysis. You don't get the bulk history output of other systems or the ability to cobble it together from complete effective-dated transactions across objects. To capture the complete history we had to build an intense process of programmatically retrieving data, evaluating it, and running other API calls to build the full history that we need. 
If you want more detail, take a look at my blog post on the subject, "The End of the Snapshot: Workday Edition." The complexity of the integration, therefore, is multiplied, and the time taken suffers immensely due to the object-oriented architecture that requires you to load each object into memory in order to retrieve it. A full destructive data extraction means you're looking at 8+ hours for a small-to-medium enterprise, expanding to a week if you're a giant. The problem is exacerbated by the number of parallel connections allowed, which in practice runs at a fraction of the stated limit. A full historical API integration here is not for the faint of heart or skill; we have spent 12+ months enhancing and tweaking our integration with each release (weekly) to improve performance and solve data challenges. To give a sense of scale, our integration generates some 500+ tables that we bring together in our modelling engine in preparation for analytics. Beware of Oversimplifying the API Integration Out-of-the-box integration plugins are going to be focused on the snapshot version of data as well, so if you don't have the integration resources available I wouldn't attempt an API integration. My advice is to stick with the built-in reporting tools to get off the ground. The RaaS tools do a good job of combining objects and running in a performant manner (better than the API). However, they will also be snapshot focused, and as painful as it will be to build and run each timepoint, you will at least be able to obtain a basic feed to build upon. You won't have the full change history for deeper analysis until you can create a larger integration, or can drop in One Model. Robert Goodman wrote a good blog a little while back looking at both the API and his decision to use RaaS at the time; take a read here. Workday API vs RaaS Regardless of the problems we see with the architecture, the API is decent and one of our favorite integrations to work with. 
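For teams going the RaaS route, the mechanics are worth seeing: each custom report is exposed at a tenant-specific URL you can pull with basic auth. The sketch below only builds the request; the host, tenant, owner, and report names are hypothetical, and the `customreport2` URL pattern is the common convention, so confirm the exact endpoint and credential format against your own tenant before relying on it:

```python
# Hedged sketch: constructing an authenticated RaaS report request.
import base64
import urllib.request

def raas_request(host, tenant, owner, report, user, password, fmt="csv"):
    """Build (but do not send) a basic-auth request for a RaaS custom report."""
    url = f"https://{host}/ccx/service/customreport2/{tenant}/{owner}/{report}?format={fmt}"
    # Integration accounts are conventionally addressed as user@tenant -- verify for your setup.
    token = base64.b64encode(f"{user}@{tenant}:{password}".encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

req = raas_request("wd2-impl.workday.com", "acme", "ISU_Analytics",
                   "Headcount_Snapshot", "integration_user", "secret")
print(req.full_url)
```

Remember the caveat from above: whatever a report like this returns is still a snapshot at a point in time, so you would schedule and stack these pulls to approximate history.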
Given the data challenges we have seen and experienced, however, it is little wonder that half of our customers are now Workday customers. One Model Integration Capabilities with Workday One Model consumes the public web service APIs for all standard objects and fields. One Model configures and manages the services for API extractions; customers need only create and supply a permissioned account for the extraction. Custom objects and fields need to use a RaaS (Report-as-a-Service) definition created by the customer in the Enterprise Interface Builder (EIB). The report can then be transferred by SFTP or can be interacted with as an API itself. Figure 1: One Model's data extraction from Workday SuccessFactors One Model rating - 4/5 Method - API The Good - A dynamic API that includes all custom MDF data!! Runs relatively quickly; Comprehensive module coverage; The Bad - Several API endpoints that need to be combined to complete the data view; Can drop data without indication; At times confusing data structures 4 out of 5 is a pretty phenomenal rating in my book. I almost gave SuccessFactors a perfect 5, but there are still some missing pieces from the API libraries, and we've experienced some dropped data at times that has required some adaptations in our integration. Overall, the collection of SF APIs is a thing of beauty for one specific reason: it is dynamic and can accommodate any of the Metadata Framework (MDF) custom changes in its stride. This makes life incredibly easy when working across multiple different customers and means we can run a single integration against any customer and accurately retrieve all customizations without even thinking about them. Compared to Workday, where the API is static in definition and only covers the standard objects, this facet alone is just awesome. This dynamic nature, though, isn't without its complexities. 
It does mean you need to build an integration that can interrogate the API and iterate through each of its customizations. However, once it is complete, it functions well and can adapt to changing configurations as a result. Prepare to Merge API Integrations for People Analytics Multiple API endpoints also require different integrations to be merged. This is a result both of upgrades in the available APIs (the older SuccessFactors API versus the newer OData API) and of separate APIs for acquired parts of the platform (i.e. Learning, from the Plateau acquisition). We're actually just happy there is now an API to retrieve learning data, as this used to be a huge bugbear when I worked at SuccessFactors on the Workforce Analytics product. The only SF product I know of right now that doesn't have the ability to extract from an API is Recruiting Marketing (RMK), from the jobs2web acquisition; hopefully this changes in the future. Full disclosure: I used to hate working with SuccessFactors data when we had to deal with flat files and RDFs, but with the API integration in place we can be up and running with a new SuccessFactors customer in a few hours and be confident all customizations are present. Another option - Integration Center I haven't spoken here about the new Integration Center released earlier last year, as we haven't used it ourselves and only have anecdotal evidence from what we've read. It looks like you could get what you need using the Integration Center and deliver the output to your warehouse. You will obviously need to build each of the outputs for the integration, which may take a lot of time, but the data structure, from what I can tell, looks solid for staging into an analytics framework. There are likely a lot of tables to extract and maintain, though; we currently run around 400+ tables for a SuccessFactors customer and model these into an analytics-ready model. 
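The dynamic behaviour described above is possible because an OData API publishes a $metadata document describing every entity set, including MDF custom objects (conventionally prefixed `cust_`). Here is a sketch of that discovery step; it parses a trimmed, made-up metadata payload rather than making a live call, and the entity names are illustrative:

```python
# Sketch: enumerating entity sets (standard + cust_ MDF objects) from $metadata.
import xml.etree.ElementTree as ET

METADATA_XML = """<edmx:Edmx xmlns:edmx="http://schemas.microsoft.com/ado/2007/06/edmx"
                             xmlns="http://schemas.microsoft.com/ado/2008/09/edm">
  <edmx:DataServices>
    <Schema Namespace="SFOData">
      <EntityContainer Name="EntityContainer">
        <EntitySet Name="User" EntityType="SFOData.User"/>
        <EntitySet Name="EmpJob" EntityType="SFOData.EmpJob"/>
        <EntitySet Name="cust_SpecialAssignment" EntityType="SFOData.cust_SpecialAssignment"/>
      </EntityContainer>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>"""

def entity_sets(metadata_xml):
    """Return every EntitySet name declared in a CSDL metadata document."""
    root = ET.fromstring(metadata_xml)
    ns = "{http://schemas.microsoft.com/ado/2008/09/edm}"
    return [e.attrib["Name"] for e in root.iter(f"{ns}EntitySet")]

sets = entity_sets(METADATA_XML)
print(sets)                                        # everything extractable
print([s for s in sets if s.startswith("cust_")])  # customer MDF objects
```

Driving the extraction loop off this list, instead of a hard-coded table of objects, is what lets one integration pick up any customer's customizations automatically.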
If anyone has used the Integration Center in an analytics deployment, please feel free to comment below or reach out, and I would be happy to host your perspective here. One Model Integration Capabilities with SAP SuccessFactors One Model consumes the SF REST APIs for all standard fields as well as all customized fields, including any use of the MDF framework. One Model configures and manages the service for API extractions; customers need only create and supply a permissioned account for the extraction. SF has built a great API that is able to provide all customizations as part of the native API feed. We do use more than one API, though, as the new OData API doesn't provide enough information, and we have to use multiple endpoints in order to extract a complete data set. This is expertly handled by One Model software. Figure 2: One Model's data extraction from SuccessFactors Oracle HCM Cloud (Fusion) One Model rating - 2/5 Method - HCM Extracts functionality; all other methods discounted from use The Good - HCM Extracts is reasonable once you have it set up. History and all fields available. Public documentation. The Bad - The user interface is incredibly slow and frustrating. Documentation has huge gaps from one stage to the next where experience is assumed. API is not functional from a people analytics perspective: missing fields, missing history, suitable only for point-to-point integrations. Reporting/BI Publisher, if you can get it working, is a maintenance burden for enhancements. HCM Extracts works well, but the output is best delivered as an XML file. I think I lost a lot of hair and put on ten pounds (or was it ten kilos?!) working through a suitable extraction method for the HCM Cloud suite that would give us the right level of data granularity for proper, historically accurate people analytics data. We tried every method of data extraction, from the API to using BI Publisher reports and templates. 
I can see why people who are experienced in the Oracle domain stick with it for decades; the experience here is hard-won and akin to a level of magic. The barriers to entry for new players are just so high that even I - a software engineer and data expert with a career spent in HR data many times over - could not figure out how to get a piece of functionality working that in other systems would take a handful of clicks. Many Paths to HRIS System Integration In looking to build an extraction for people analytics, you have a number of methods at your disposal. There's now an API, and the built-in reporting could be a reasonable option for you if you have some experience with BI Publisher. There are also the HCM Extracts built for bulk extraction purposes. We quickly discounted the API as not yet being up to scratch for people analytics purposes, since it lacks access to subject areas and fields and cannot provide the level of history and granularity that we need. I hope that the API can be improved in the future, as it is generally our favorite method for extraction. We then spent days and probably weeks trying to get the built-in reporting and BI Publisher templates to work correctly and deliver the data we're used to from our time using Oracle's on-premise solutions (quite a good data structure). Alas, this was one of the most frustrating experiences of my life. It really says something when I had to go find a copy of MS Word 2006 in order to use a plugin that for some reason just wouldn't load in MS Word 2016, all to edit and build a template file to be uploaded, creating multiple manual touchpoints whenever a change is required. Why is life so difficult?? Even with a bunch of time lost to this endeavour, our experience was that we could probably get all the data we needed using the reporting/BI Publisher route, but that it was going to be a maintenance nightmare: any change to an extract would require an Oracle developer to make sure everything ran correctly. 
If you have experienced resources, this may still work for you. We eventually settled on the HCM Extracts solution, which, while mind-numbingly frustrating to build an extract in, will at least reliably provide access to the full data set and deliver it in an output that, with some tooling, can be ingested quite well. There are a number of options for how you can export the data, and we would usually prefer a CSV-style extraction, but the hierarchical nature of the extraction process here means that XML becomes the preferred method, unless you want to burn the best years of your life tediously creating individual outputs for each object by hand in a semi-responsive interface. We therefore figured it would be easier, and would enhance maintainability, if we built our own XML parser for our data pipeline to ingest the data set. There are XML-to-CSV parsers available (some for free) if you need one, but my experience is that they struggle with some files and fail to deliver a clean output for ingestion. With an extract defined, though, there is a good number of options for how to deliver and schedule the output, and reliability is good. We've had only a few issues since the upfront hard work was completed. Changing an extract is also relatively straightforward: if you want to add a field or object, you can do so through the front-end interface in a single touchpoint. We do love Oracle data, don't get me wrong - the construction and integrity are good, and we have a repeatable solution for our customer base that we can deliver at will - but it was a harrowing trip of discovery that, to me, explains why we see so few organizations from the Oracle ecosystem out there talking about their achievements. Don't make me go back, mommy! Want to Better Understand How One Model can Help You? Request a Demo Today. Other HRIS Comparisons Coming Soon ADP Workforce Now
Featured
11 min read
Jamie Strnisha
One of the most common reporting challenges companies face is balancing headcount over time by adding and subtracting Hires and Terminations. How to Calculate Headcount? The process seems like it should be simple, especially when someone with a background in finance or accounting first looks at the issue. The misconception is that the company’s headcount will balance in much the same way money in a financial statement balances: the analyst takes the initial amount of money the company has, adds the money that came in for the month (e.g. customer sales, invoices), subtracts the money that went out for the month (e.g. transportation, payroll costs, rent), and arrives at a final amount for the month, which becomes the starting point for the next month. If that’s how a financial statement is balanced, it seems the same concept should be easily applied to balancing headcount reporting metrics. It might seem that a company should be able to use the following formula: Starting Headcount + Hires – Terminations = Ending Headcount And everything would balance and net out. Unfortunately, accounting does not work out the same way in HR as it does in Finance. Rarely (if ever) does this simple formula work when counting people instead of money. There are a number of common reasons why this formula fails when applied to the reconciliation of headcount reports over time: People who start at the beginning of the month are included in both starting headcount and hires for the time period, leading to some of the same people being counted twice. People who leave at the end of the month are included in both ending headcount and terminations for the time period, again leading to double counting. People on leave of absence may suddenly enter or exit headcount without a hire or termination. The company may have restrictions on certain types of workers (e.g. Interns, Contractors) and exclude them from the headcount when they are in that category. 
If these workers move from an excluded category to one that is included in the headcount, or vice versa, they might suddenly appear in or disappear from the headcount without a hire or termination. The company may also want to exclude certain Hire and Termination actions, such as acquisitions or divestitures, which again will unbalance the headcount as a worker suddenly appears in or disappears from it. Fortunately, One Model can solve all of these headcount-balancing issues relatively easily by creating a new set of metrics, specific to the company’s data, that include populations that might not normally be counted and that exclude or include Hires and Terminations at the beginning and end of a time period. The New Metric: Reconciliation Headcount This Reconciliation Headcount Reporting Metric is effectively a more accurate mathematical equation that balances headcount, reflecting these quirks in people data so the result matches the financial statement approach to reconciliation. Each customer that works with One Model will have a slightly different version of the Reconciliation Headcount metric based on the methodology they use to count a Hire or Termination. An example formula for this metric may look like this: (Ending Headcount + Terminations on the Last Day of the Previous Time Period – Terminations on the Last Day of the Current Time Period) – (Starting Headcount + Hires – Terminations – Divestitures) When properly constructed, the new metric will sum to 0, eliminating the problem HR sometimes has in justifying apparent irregularities when reconciling headcount. If the metric does not equal 0, there is at least one person or event in the data without the requisite hire or termination to balance it out, and the company should investigate the record. 
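An example formula like the one above can be sketched in code. Here is a minimal Python sketch, where the field names, the sample figures, and the exact set of adjustments are all illustrative; each company's version of the metric will differ:

```python
def reconciliation_headcount(period):
    """Return 0 when headcount balances for a time period; a nonzero
    value flags at least one record without a matching hire or
    termination event."""
    ending_side = (
        period["ending_headcount"]
        + period["terms_last_day_prev_period"]  # add back terminations counted in the prior period
        - period["terms_last_day_curr_period"]  # remove terminations still sitting in ending headcount
    )
    starting_side = (
        period["starting_headcount"]
        + period["hires"]
        - period["terminations"]
        - period["divestitures"]  # exclude workers who left via divestiture
    )
    return ending_side - starting_side

# Hypothetical figures for a single month.
december = {
    "starting_headcount": 1000,
    "hires": 40,
    "terminations": 25,
    "divestitures": 0,
    "ending_headcount": 1016,
    "terms_last_day_prev_period": 2,
    "terms_last_day_curr_period": 2,
}
print(reconciliation_headcount(december))  # nonzero, so a record needs investigating
```

In this made-up month the metric comes out to 1 rather than 0, signaling one record that should be investigated.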
One Model can also provide the company with a set of metrics that explain the difference between the events and populations included in the inputs for the new metric calculation and what the company would otherwise use for standard reporting on headcount, hires, and terminations. One Model Application: Using the Reconciliation Headcount Metric Once the Reconciliation Headcount Metric is created, it can be used to monitor and understand data changes over time that might not be apparent in a less refined approach to reconciliation. The following is an example of the Headcount Reconciliation Metric for Company A across 4 months. If all factors that affect headcount are included in the reconciliation calculation, then each month the Headcount Reconciliation Metric would show as 0. In this example, the headcounts for November and February have no irregularities, but December and January, instead of the expected 0, show 1 and 2 respectively. Pinpointing the Discrepancy: Is it Even Possible? Without the Reconciliation Headcount Metric and One Model, it could be difficult to pinpoint the source of these discrepancies. In fact, it might not be possible at all, depending on how the data was reported. If an analyst were lucky enough to be using lists of individuals to perform the reconciliation and had the actual records for all relevant points in time (the beginning and end of each month), they might be able to figure out the specific people accounting for the differences in December and January by using VLOOKUP formulas in Excel to locate each difference. Of course, this would make the entire reconciliation process very time consuming. Reconciling headcount may not even be possible in all situations. In some cases, the analyst may be adding or subtracting data from past months’ reports that have already been aggregated. 
Using aggregated data would make the reconciliation process almost impossible, since the data in the source system may have changed since the reports were run, and the analyst would not be able to pinpoint the specific people whose situations are creating the discrepancies. One Model’s List Report Feature: Easy Identification of the Discrepancy in Headcount Reporting Metrics The One Model platform has a feature that eliminates all of these problems and makes the reconciliation process very easy: List Reports. In One Model, a user can take a metric and then look within it to find the data points that are causing the discrepancies. In the example of Company A, where there appeared to be discrepancies in the December and January headcount reports, the analyst creates a List Report that includes the Headcount Reconciliation Metric, Worker Number, and Name for every individual accounted for in that period. Any individual whose status changed during the time period but was not properly accounted for in the reconciliation process is flagged with a + or - in the Headcount Reconciliation column. The List Report can then be filtered to show only those individuals whose records are the cause of the apparent accounting error. In the example of Company A above, there was a net discrepancy of one for the month of December. By exporting the data and filtering out the 0s, only one record had a +1 in December. In only a couple of simple steps, it was easy to determine that Joe Williams’s record is the source of the discrepancy in the headcount for December. Identifying the Reason for the Discrepancy and Rebalancing Headcount After identifying Joe Williams, the next question is why his record caused this discrepancy. Since his record caused an addition to headcount, it may make sense to first look at the data for hires and see whether a new code was added to Company A’s HRIS that was not included originally. 
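The filter-out-the-0s step described above can be sketched in Python. The rows below are a made-up stand-in for a List Report export; only Joe Williams comes from the example, and the other names and worker numbers are invented:

```python
# Illustrative stand-in for a December List Report export: one row per
# worker, with the Headcount Reconciliation value for the period.
december_rows = [
    {"worker_number": 101, "name": "Ana Lopez",    "headcount_reconciliation": 0},
    {"worker_number": 102, "name": "Joe Williams", "headcount_reconciliation": 1},
    {"worker_number": 103, "name": "Dana Smith",   "headcount_reconciliation": 0},
]

# Filter out the 0s, leaving only the records causing the imbalance.
discrepancies = [row for row in december_rows
                 if row["headcount_reconciliation"] != 0]

for row in discrepancies:
    print(row["worker_number"], row["name"], row["headcount_reconciliation"])
```

With the 0s removed, the single remaining row points straight at the record whose status change was not accounted for.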
In the example for Company A, Joe entered Company A through an acquisition that was not coded as a hire. As a result, it is now apparent that the Headcount Reconciliation metric should be revised to include individuals who joined Company A through an acquisition. After Headcount Balances, What Next? Net Internal Change The Headcount Reconciliation Metric can now be used to better understand net internal change within Company A. In the example below, Company A’s Headcount Reconciliation Metric is broken out by Department. In disaggregated form, it’s easy to see that in December the company had 1 net move into Commercial and 1 net move out of HR. Even more helpfully, it’s possible to see that while headcount balanced at the overall level for November and February, there were actually movements across departments in those months. The fact that those movements netted to zero made them seem to vanish from the reconciliation metric, but One Model still makes it possible to identify this movement. One Model’s List Report Identifies the Individual Change Records Looking again at December, adding the Department field to the List Report reveals a department change for a different worker. In this situation, we see that Chris Jones moved from HR to Commercial in December. Using the Reconciliation Headcount Metric makes it possible to look at internal movements and understand how the company’s headcount has changed internally over time. Difference between Typical Internal Movement Metrics and the Headcount Reconciliation Net Change While customers can traditionally use events like Transfers, Promotions, and Demotions to pinpoint internal movement, these methods can often be deceiving. Very often customers do not have strict business processes governing what is counted in these movements, and events in the HRIS get coded as Transfers when they are technically data corrections. 
A Promotion may get coded as such when it is really a Transfer or lateral move because the manager wants to send a positive message to an employee. While the net difference derived from the Headcount Reconciliation Metric doesn’t necessarily resolve all of those issues, it does allow the analyst to see the specific internal net change across time. The examples above used months, but the time period could have been any interval (e.g. year, quarter, week). If you want to know more about One Model or Headcount Reconciliation, we’d be happy to talk to you. Personally, I love talking about people data and how to construct metrics that drive business decisions. Want to learn how your company can benefit from using One Model? Have questions about your team's specific challenges in balancing headcount and internal net movements? Learn more about the benefits of One Model and sign up for a demo.
Featured
8 min read
Dennis Behrman
Brilliant talent decisions require superb data. But how do you know what decision power lies in your data? Test it for yourself! Our People Analytics Challenge can give you a sense of the state of your human resources data and your organization's ability to make great decisions. To find out just how prepared your organization is to make brilliant talent decisions with the data you have, download our handy worksheet here. People Data Cloud™, One Model’s leading people analytics platform, can transform how you make talent decisions by making the human resources and related enterprise data you already have accessible. Unlocked and properly harnessed, people analytics can be an incredibly valuable asset to your company. Understanding the People Analytics Challenge Where did we get the questions? Our very own Phil Schrader (One Model's Solution Architect), along with his peers across our business, spent decades in the human resources function applying data analysis to common HR decisions and solving talent challenges. Phil compiled the most common questions into this worksheet. He then used our proprietary People Data Cloud technology to produce the tables, charts, graphs, and reports that demonstrate how a true people analytics capability can help HR practitioners. For sh*ts and giggles, we also timed how long it took Phil to arrive at these answers. Here are some of those questions. Question #74: What is our new hire failure rate, by tenure and by department? Phil's time on this KPI: 4 minutes 45 seconds Finding out where your new hires fail, and at what point in their tenure, gives you a clear place to focus attention. Addressing new hire failures with this data reduces the average cost to hire and gives the organization continuity and increased productivity. Next up, one of the most important people metrics of all time. 
Question #1: What is our revenue per employee? Phil's time on this KPI: 5 minutes 24 seconds Building a straightforward report like this doesn't take much time, but One Model also has a lot of pre-built views that interpret your data after ingestion and create the data visualization for you. Let's see how quickly Phil can get information like this. Are you ready to take the People Analytics Challenge? Download our whitepaper and put your team to the test. Follow the hashtag #peopleanalyticschallenge on LinkedIn and let us know if you can beat Phil! Question #38: What is the ratio of managerial to non-managerial employees, and how does this vary by department? Phil's time on this KPI: 2 minutes 43 seconds It actually took Phil longer to take the screenshot I needed for this blog post than it did for him to pull the answer to this question. Now, let's pick a question that may require a little debate on how the metrics are built. This question will require Phil to meet with key players in the company to ensure the customizable calculations are exactly what we need to get the best insight. Question #25: What is the depth of our leadership pipeline within the company? Phil's time on this KPI: 44 minutes 56 seconds Benchmarks and succession planning in people analytics are not always measured the same way from company to company. To make our timing as realistic as possible, we put together a quick meeting with our VP of Strategy and mapped out the best methodology for the three charts. If you'd like to better understand succession planning and see how the build was done, watch the video on this post. After we aligned on the variables we needed to include and how to piece them together, connecting the dots did not take much time: about 15 minutes. Do you have a team of data scientists at your fingertips? All People Analytics Enterprise Solutions customers get a custom Blueprint and ongoing support to get the most out of their data. 
As the final part of this challenge, I want you to ask your team this question: “Can we trust the data?” If your team is working tirelessly in spreadsheets or SQL, they can probably tell you exactly how those metrics were calculated. However, if you are using some other HR analytics reporting tool, you’re trusting a black box. Moving to One Model not only means you can get to the analysis you need in record time, but also that you have a fully transparent platform, adjustable to meet the specific requirements of your organization. Are you Stumped? Get a Demo: Regardless of where your organization is on the maturity scale, this “challenge” can help you select the right people analytics key performance indicators and analyses for your business context. Arguably, any strong people analytics function should be able to answer these KPI questions, but success comes from focusing on the questions that matter and will drive business outcomes. Take the #PeoplesAnalyticsChallenge and let us know how fast you can pull that information in your organization.
Featured
21 min read
Nicholas Garbis
Retailers are riding a supercharged shopping cart full of change, accelerated by the pandemic and exacerbated by a one-company megatrend: Amazon. Amazon has over 10,000 people … just in Kentucky! That’s more than many retailers' entire organizations. Accordingly, workforce issues are some of the biggest strategic challenges that the rest of the US retail industry faces today. People analytics, which aims to improve decisions involving employees, work, and business objectives, can deliver immediate impact to retailers by bringing better data and insights to leaders at all levels of the organization who are making workforce decisions every day. Every retailer has sufficient velocity and scale to be a great candidate for the enormous value that can be captured through analytics, modeling, and insight generation. People Analytics (PA): The application of data and insights to improve business outcomes through better decision making regarding people, work, and business objectives. (Source: Explore the Power of People Analytics, Whiteman and Garbis, 2020) Retailers once blazed the early trails of people analytics. In the first wave, from 2005-2015, big retailers were at the forefront. Unfortunately, over the past decade the retail industry has mostly been surpassed in its people analytics prowess by peers in industries such as technology, financial services, and pharmaceuticals. Perhaps too many of those analytics leaders moved out of retail and into other industries? The pressure of today’s HR challenges in retail should inspire us to find that innovative spark once again. Retailers are in a great position to drive change in their organizations through effective people analytics. Using people analytics technology, they can unlock significant value and show how workforce issues are understood and resolved. There’s really no choice but to apply the best decision-making practices possible toward solving these workforce challenges. 
The entire business model depends on it. How might data help us to better understand these issues? How can we use data and insights to decide which actions to take? Find out more by downloading our Retail Whitepaper. If you’re a human resources leader in retail, you may be inspired to rise to these challenges by applying people analytics to get the most value from your workforce data. Amazon has been rolling over traditional retailers, capturing market share and exacerbating workforce challenges. Challenge #1: Workforce Retention Is Not Just A Stores Issue Anymore Retail has always had high turnover rates, but the truly eye-popping annual rates of 100%+ were once limited to store locations. These rates were treated as an accepted fact, a cost of doing business, and an operational challenge for management. In the past, only modest efforts were made in response to turnover, such as faster training to decrease time-to-productivity, recruitment automation to decrease open rates, and better benchmarking of prevailing local wages through labor market analysis. Now warehouses and distribution centers are exhibiting turnover rates that rival the stores. Turnover rates in DCs and warehouses historically ranged from 30-40% annually. Today, retailers see figures approaching 100% or more. Wow! In the meantime, store turnover has remained persistently elevated. In response to this challenge, retailers have adopted a “hole plugging” approach that involves ramping up recruiting resources to backfill departing employees. The analogy of a draining bathtub fits: the drain keeps opening wider, and stores keep trying to open the faucet further in response. Retail hourly workers are leaving for wage increases in many cases, but they are also leaving over concerns such as scheduling practices, paid time off policies, COVID and other safety protocols, and career growth opportunities. 
And the competition across big box retailers is being compounded by Amazon’s exploding demand for workers to fill its massive warehouse expansion. People Analytics to Consider for Workforce Retention: New Hire Failure Rate. This metric involves calculating the portion of hires that do not last 30 or 60 days at a given location, district, region, or across the total chain. Analysts can then explore “hot spots” and address the causes to improve recruiting sources, selection and evaluation criteria, onboarding processes, etc. Drivers of Turnover. Advanced analytical methods can help determine the key drivers of turnover for stores and warehouse locations, as well as up through the organizational hierarchy. Cost of Turnover. This metric helps HR teams calculate the cost to replace an average store or warehouse worker. Analysts should provide this information to leaders so that these costs can inform turnover reduction strategies. Cluster Analysis. This type of analysis helps to determine whether there are groups of similar locations that have significantly different turnover rates. HR retail teams can then conduct qualitative interviews to discover best practices that can be tested in other locations. Challenge #2: The Digital Talent Squeeze The competition for digital talent seems to grow fiercer every day. Talented developers, software engineers, product managers, and data scientists are moving between a wide range of industries and in and out of start-ups. Most significantly, Amazon, Google, Apple, Facebook, and Microsoft are scooping up talent by the truckload, bidding up salaries and emptying the shelves of more cost-conscious retailers. The competition is so unrelenting that newly hired digital talent is even being coaxed away between offer acceptance and start date. Wage compression is a significant concern, as starting pay for highly prized new hires approaches that of their more experienced peers and even their leaders. 
The impact on retailers extends beyond the costs and frustrations of hiring and losing new hires. Constantly open positions impede progress on the digital innovation that retailers desperately need to remain competitive, such as customer-facing solutions that meet changing shopping patterns and automation in the supply chain and stores. As retailers struggle to fill these digital roles, they are becoming more open to remote and location-flexible talent. This gives them a wider talent pool to recruit from, but retailers may find it difficult to manage these exceptions as they return to physical offices when the pandemic subsides. Retailers who push too aggressively on a return to the office could lose the talent they worked so hard to secure. Or they may half-knowingly end up with a two-tier policy where remote work is only available to those with rarefied skills. People Analytics to Consider for the Digital Talent Squeeze: Internal Mobility Rate. This metric involves calculating the rate at which digital talent moves into different roles. Stagnant pockets of ‘hoarded’ talent should raise concerns, since that talent will eventually find opportunities outside (rather than inside) the organization. Recruiting Funnel Analytics. HR retail teams should identify the phase of the recruiting process where the most-qualified talent voluntarily drops out of the candidate pool. Within this broader withdrawal-rate analysis, you can look at the rate at which offers are declined. There’s also plenty of value in diagnosing where the speed of the recruiting process can be improved. Retention Surveys. Develop surveys that create a deeper understanding of the factors that keep critical digital talent in their roles, and gather specific data to help you learn whether your employee value proposition (EVP) is clear and compelling for talent in key roles. 
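The recruiting funnel analytics described above boil down to a stage-by-stage drop-off calculation. Here is a minimal Python sketch; the stage names and candidate counts are invented for illustration, and real figures would come from the applicant tracking system:

```python
# Hypothetical candidate counts at each recruiting stage.
funnel = [
    ("applied", 500),
    ("phone_screen", 200),
    ("onsite_interview", 80),
    ("offer_extended", 30),
    ("hired", 18),
]

# Stage-to-stage drop-off shows where the most candidates exit the pool.
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    drop_off = 1 - next_count / count
    print(f"{stage} -> {next_stage}: {drop_off:.0%} drop-off")

# The offer decline rate mentioned above falls out of the last two stages.
counts = dict(funnel)
decline_rate = 1 - counts["hired"] / counts["offer_extended"]
print(f"offer decline rate: {decline_rate:.0%}")
```

Comparing drop-off rates across stages (and across locations or roles) points to where recruiting attention, or speed, will pay off most.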
Challenge #3: Seasonal Staffing Models May Be Broken In retail, the holiday season accounts for about 20% of annual sales but occurs during a 10% slice of the calendar. For roughly five weeks, every aisle and sub-aisle gets clogged with stacks of TVs, toys, and gadgets. Every register light is blinking and joyful music plays in stores. This is shopping nirvana, and retailers are at the center of everyone’s life. But staffing up to deliver on this experience is becoming more and more difficult. Some retailers have quantified lost sales due to their inability to staff at necessary levels. The tight labor market is not just due to competition in the mall or across the parking lot. Changing shopping patterns have forced retail warehouses to hire incredible numbers of temporary workers for the holidays. Limited supply and high demand force temporary worker wages up, and now it’s not uncommon for a temporary seasonal worker to be enticed away to another retailer during the short holiday season. So the seasonal staffing model that has worked for so many years may have finally broken. It may be impossible (or cost prohibitive) to find and ramp up the number of workers needed in such a short period of time. Rather than increasing recruiting efforts or recruiting earlier, some retailers are experimenting with more durable, lasting relationships with temporary workers. An “occasional” or “intermittent” employee type could keep workers active in the HR system if they work a certain minimum number of shifts within a certain period of time. There are many advantages to “occasional employment” that address the challenge of holiday staffing. Retailers can benefit from decreased recruiting costs, reduced paperwork, faster onboarding and time-to-productivity, higher retention rates, and more experienced customer service. Naturally, there are some costs associated with keeping more active employees on the books. 
If an occasional employee works just one shift per month, that employee will probably be less productive than a full-timer on a per-hour basis. However, when the holiday season arrives, that occasional employee will be better prepared and more reliable, reducing the overall staffing costs of the store. An effective “occasional employee” strategy may require a change to a more flexible shift bidding and selection system. It’s also worth considering whether incentives can be created to encourage more hours. In ride-sharing apps like Uber, surge pricing is built into the system. Perhaps there could be surge wage accelerators for certain shifts? People Analytics to Consider for Seasonal Staffing: Seasonal Staffing Rate. Retailers can calculate the increase in staffing (including both headcount and total hours worked) that each store or warehouse has experienced. Some locations will have more hours from seasonal workers than others. Focus on where the biggest opportunities for impact are. Seasonal Staffing Rehire Rate. This metric helps to determine how successful a retailer has been at re-recruiting seasonal workers. Ideally, rehires should be associated with lower recruiting and onboarding costs and higher retention rates. This measure informs efforts to sustain relationships with seasonal workers outside of peak periods. Seasonal Staffing Cost Analysis. Retailers can develop a comparative cost model for sustaining “occasional” workers versus the current seasonal staffing model. This foundational measure can lead to a cost-effective and timely staffing model alternative. Download the Whitepaper to Share with Colleagues Challenge #4: Unionization A labor relations leader at a large global firm once said, “No site ever got a union that didn’t deserve one.” His point was that unionization is a direct result of management’s failure to provide workers the things they value most. 
In retail, workers appear to value: wages that can sustain a worker who is working full time; transparent and timely scheduling practices, including schedules published with sufficient and consistent advance notice; pay for shifts that are cut short due to low customer or warehouse volume; acceptable policies and practices for bathroom breaks; limits on scheduling beyond stated availability or short-notice extra hours; participation in paid time-off policies; and career advancement opportunities. It’s notable that most of these items are management practices and policies that carry no direct additional cost. Recently, we’ve witnessed some high-profile unionizations, including one Amazon warehouse (out of 110 total) and eight Starbucks stores (out of 15,000 total). These unionizations have sent shock waves through retailer headquarters. Many more union drives are currently in progress across the industry. In response, retail leaders are looking back through their aging “union avoidance” playbooks. Many of these have leaked to the public and are somewhat unflattering for their brands. There is a positive side to the story. The desire to unionize indicates that workers want to keep their jobs and do not want to quit outright. Unionization, therefore, is an opportunity for the retailer and their workforce to align on their shared and individual values and desires. Unionization drives are also signals that there may be an issue with the quality of front-line management, and possibly the broader company culture and policies. The retail industry is built on the idea of scale, where each store or warehouse runs the “same play” with minimal variation. Retailers are not well-equipped to manage an ever-widening range of policies across unionized and non-unionized locations. 
Unionization presents significant challenges that include logistical and communications activities, negotiations at each unionized site, management of a portfolio of agreements, and the possibility of creating new benefits plans, training programs, and so on. People Analytics to Consider Regarding Unionization: Site Stability Index. Retailers should create a “balanced scorecard” that covers all locations. The scorecard should include external information such as competitor openings, nearby union participation, local unemployment rates, and relevant legislative changes in each jurisdiction. Survey Analytics. Retailers should employ a continuous pulse survey strategy with appropriate sampling and rotating questions. This information can be correlated with turnover, mobility, leadership stability, manager performance, ombuds claims, and sales or productivity versus plan. Union / Non-Union Analytics. Retailers can learn from analytics that compare similar stores along union vs. non-union dimensions. Navigating complicated waters like this requires good information. Challenge #5: Making Data-Informed Workforce Decisions Though a bit circular in the context of this paper, this last challenge is very real and concerning. Every retailer with over 10 stores has the volume and velocity of people data in its systems to support data-informed workforce decisions by HR and business leaders. Each store and warehouse leader should have basic workforce data and analytics that are a couple of clicks away. Additionally, those leaders should be expected to use data and insights when making workforce decisions. It is not a minor undertaking to change the decision-making fabric of a large organization. There are multiple levels of the organization involved in this type of change, and naturally the right tools and support are essential. 
Starting at the top, the C-suite must set the expectation by integrating data into workforce decisions in ways that are visible to their teams. CHROs need to promote and hire HR retail leaders with analytical aptitude and curiosity. They need to drive change into their own function and groom HR business partners who truly shape the business. This is impossible without integrating people data and providing vehicles for sharing it. Organizations that have done this right have built a People Analytics Center of Excellence or similar sub-function within the Human Resources department. But the people analytics COE is not simply a reporting or HR tech team. The people analytics team is dedicated to driving better, more data-informed talent decisions at all levels of the organization through content, products, insights, and advanced analytics. Data is the foundation of any people analytics COE. It is crucial that data from multiple HR and non-HR sources is integrated to create a high-quality data asset. Maturity in people analytics should be considered in organic, non-linear terms. You should not plan to perfect the data first, then proceed to reporting, then to analytics, as you may see in a Gartner maturity model. Instead, first gather as much data as you need to create the content and analytics that will generate the most decision-making value for the organization. Then repeat the process for additional decision domains. People Analytics Questions to Consider for Workforce Decisions: Which Data, Where, and How? Evaluate your company’s top priorities to determine what data is necessary to improve talent decisions. Locate both the HR and non-HR data required to inform those decisions. Explore how these multiple sources can be integrated. Determine which data sources are easily integrated and which integrations require heroic efforts. The Capabilities. Ask how capable your HR function is at leveraging data for talent decisions. 
Ask if your business leaders are prepared to do the same. The Tools. Determine whether a visualization and reporting platform is available to extend people analytics content across the entire organization. Is it flexible enough to accommodate your needs as they evolve? The retail industry is facing a dramatic inflection point in its ability to make brilliant talent decisions that propel profit growth, reduce risk, and deliver an incredible employee experience in which people thrive. The key challenges of HR in the retail sector cannot be overstated. The next era of people analytics for the retail industry is now, and you are here to lead it. We’re looking forward to exploring this with you. Would you like to learn more about people analytics obstacles in retail and how we can solve them? Sign up for a Demo:
Featured
10 min read
Phil Schrader
Post 1: Sniffing for Bull***t. As a people analytics professional, you are now expected to make decisions about whether to use various predictive models. These are surprisingly difficult decisions with important consequences for your employees and job applicants. In fact, I started drafting a lovely little three-section blog post around this topic before realizing that there was zero chance I was going to be able to pack everything into a single post. There are simply no hard and fast rules you can follow to know whether a model is good enough to use “in the wild.” There are too many considerations. To take an initial example, what are the consequences of being wrong? Are you predicting whether someone will click on an ad, or whether someone has cancer? In fact, even talking about model accuracy is multifaceted. Are you worried about detecting everyone who does have cancer, even at the risk of false positives? Or are you more concerned about avoiding false positives? Side note: If you are a people analytics professional, you ought to become comfortable with the ideas of precision and recall. Many people have produced explanations of these terms, so we won’t go into them here. Here is one from “Towards Data Science”. So, all that said, instead of a single long post attempting to cover a respectable amount of this topic, we are going to put out a series of posts under the heading: Evaluating a Predictive Model: Good Smells and Bad Smells. And, since I’ve never met an analogy that I wasn’t willing to beat to death, we’ll use that smelly comparison to help you keep track of the level at which we are evaluating a model. For example, in this post we’re going to start way out at bull***t range. Sniffing for Bull***t As this comparison implies, you ought to be able to smell these sorts of problems from pretty far out. In fact, for these initial checks, you don’t even have to get close enough to sniff around at the details of the model. 
You’re simply going to ask the producers of the model (vendor or in-house team) a few questions about how they work to see if they are offering you potential bull***t. At One Model, we're always interested in sharing our thoughts on predictive modeling. One of these great chats is available on the other side of this form. Back to our scheduled programming. Remember that predictions are not real. Because predictive models generate data points, it is tempting to treat them like facts. But they are not facts. They are educated guesses. If you are not committed to testing them and reviewing the methodology behind them, then you are contenting yourself with bull***t. Technically speaking, by bull***t I mean a scenario in which you are not actually concerned with whether the predictions you are putting out are right or wrong. For those of you looking for a more detailed theory of bull***t, I direct you to Harry G. Frankfurt. At One Model we strive to avoid giving our customers bull***t (yay us!) by producing models with transparency and tractability in mind. By transparency, we mean that we are committed to showing you exactly how a model was produced, what type of algorithm it is, how it performs, how features were selected, and what other decisions were made to prepare and clean the data. By tractability, we mean that the data is traceable and easy to wrangle and analyze. When you put these concepts together, you end up with predictive models that you can trust with your career and the careers of your employees. If, for example, you produce an attrition model, transparency and tractability will mean that you are able to educate your data consumers on how accurate the model is. It will mean that you have a process set up to review the results of predictions over time and see if they are correct.
It will mean that if you are challenged about why a certain employee was categorized as a high attrition risk, you will be able to explain which features were important in that prediction. And so on. To take a counter-example, there’s an awful lot of machine learning going on in the talent acquisition space. Lots of products out there promise to save your recruiters time by using machine learning to estimate whether candidates are a relatively good or a relatively bad match for a job. This way, you can make life easier for your recruiters by taking a big pile of candidates and automagically identifying the ones that are the best fit. I suspect that many of these offerings are bull***t. And here are a few questions you can ask the vendors to see if you catch a whiff (or perhaps an overwhelming aroma) of bull***t. The same sorts of questions would apply in other scenarios, including models produced by an in-house team. Hey, person offering me this model, do you test to see if these predictions are accurate? Initially I thought about making this question “How do you” rather than “Do you”. I think “Do you” is more to the point. Any hesitation or awkwardness here is a really bad smell. In the talent acquisition example above, the vendor should at least be able to say, “Of course, we did an initial train-test split on the data and we monitor the results over time to see if people we say are good matches ultimately get hired.” Later on, we might devote a post in this series to self-fulfilling prophecies. Meaning, in this case, that you should be on alert for the fact that by promoting a candidate to the top of the resume stack, you are almost certainly going to increase the odds that they are hired and, thus, your model is shaping, rather than predicting, the future. But we’re still out at bull***t range, so let’s leave that aside.
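The train-test split mentioned in that vendor answer can be sketched as follows. This is an illustrative toy, not any vendor's actual pipeline; the dataset and 80/20 ratio are made up:

```python
import random

# Hypothetical dataset of (features, label) pairs; in practice this would
# be your historical candidate or employee records.
data = [({"tenure": i}, i % 2) for i in range(100)]

random.seed(42)   # fix the seed so the split is reproducible
random.shuffle(data)

split = int(len(data) * 0.8)       # hold out 20% the model never sees
train, test = data[:split], data[split:]

# Fit the model on `train` only, then score it on `test` to get an
# honest estimate of performance on unseen records.
print(len(train), len(test))
```

The point of the holdout set is exactly the "are your predictions right or wrong?" question: a model evaluated only on the data it was trained on can look perfect while being useless in the wild.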
And so, having established that the producer of the model does in fact test their model for accuracy, the next logical question to ask is: So how good is this model? Remember that we are still sniffing for bull***t. The purpose of this question is not so much to hear whether a given model has .75 or .83 precision or recall, but just to test if the producers of the model are willing to talk about model performance with you. Perhaps they assured you at a high level that the model is really great and they test it all the time-- but if they don’t have any method of explaining model performance ready for you… well… then their model might be bull***t. What features are important in the model? / What type of algorithm is behind these predictions? These follow up questions are fun in the case of vendors. Oftentimes vendors want to talk up their machine learning capabilities with a sort of “secret sauce” argument. They don’t want to tell you how it works or the details behind it because it’s proprietary. And it’s proprietary because it’s AMAZING. But I would argue that this need not be the case and that their hesitation is another sign of bull***t. For example, I have a general understanding of how the original Page Rank algorithm behind Google Search works. Crawl the web and work out the number of pages that link to a given page as a sign of relevance. If those backlinks come from sites which themselves have large numbers of links, then they are worth more. In fact, Sergey Brin and Larry Page published a paper about it. This level of general explanation did not prevent Google from dominating the world of search. In other words, a lack of willingness to be transparent is a strong sign of bull***t. How do you re-examine your models? Having poked a bit at transparency, these last questions get into issues of tractability. You want to hear about the capabilities that the producers of the model have to re-examine the work they have done. 
Did they build a model a few years ago and now they just keep using it? Or do they make a habit of going back and testing other potential models? Do they save off all their work so that they could easily return to the exact dataset that was used to train a specific version of the model? Are they set up to iterate, or are they simply offering a one-size-fits-all algorithm to you? Good smells here will be discussions about model deployment, maintenance, and archiving. Streets-and-sewers type stuff, as one of my analytics mentors likes to say. Bad smells will be high-level vague assurances or -- my favorite -- simple appeals to how amazingly bright the team working on it is. If they do vaguely assure you that they are tuning things up “all the time,” then you can hit them with this follow-up question: Could you go back to a specific prediction you made a year ago and reproduce the exact data set and version of the algorithm behind it? This is a challenging question, and even a team fully committed to transparency and tractability will probably hedge their answers a bit. That’s ok. The test here is not just about whether they can do it, but whether they are even thinking about this sort of thing. Ideally it opens up a discussion about how they will support you, as the analytics professional responsible for deploying their model, when you get challenged about a particular prediction. It’s the type of question you need to ask now because it will likely be asked of you in the future. As we move forward in this blog series, we’ll get into more nuanced situations. For example, reviewing the features used in the predictions to see if they are diverse and make logical sense. Or checking to see if the type of estimator (algorithm) chosen makes sense for the type of data you provided.
But if the model that you are evaluating fails the bull***t smell test outlined here, then it means that you’re not going to have the transparency and tractability necessary to pick up on those more nuanced smells. So do yourself a favor and do a test whiff from a ways away before you stick your nose any closer.
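To make the "could you reproduce a year-old prediction?" question concrete, here is one minimal way a team might archive run metadata alongside each scoring run. The function and field names are hypothetical, not any product's actual schema:

```python
import hashlib
import json

# Minimal sketch of "tractability": store enough metadata with each
# prediction run that the exact inputs can be verified later.
def archive_run(dataset_rows, model_name, model_version):
    # Canonical JSON so the same rows always hash the same way.
    payload = json.dumps(dataset_rows, sort_keys=True).encode()
    return {
        "model": model_name,
        "version": model_version,
        "dataset_sha256": hashlib.sha256(payload).hexdigest(),
        "row_count": len(dataset_rows),
    }

run = archive_run([{"emp": 1, "tenure": 3}], "attrition_rf", "2024.1")
print(run["dataset_sha256"][:8], run["row_count"])
```

A year later, re-hashing the archived dataset and comparing it to the stored digest proves you are looking at the same data the model actually saw, which is the kind of "streets and sewers" discipline the questions above are probing for.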
7 min read
Chris Butler
When I first started working with InfoHRM in the people analytics domain back in 2006, we were the only vendor in the space and had been for over a decade. The product was called Workforce Analytics and Planning, and after its acquisition by SuccessFactors (2010) and SAP (2012) it's still called that today. So what's the difference? Why do we have Workforce Analytics, HR Analytics, and People Analytics, and can they be used interchangeably? I have to give credit to Hip Rodriguez for the subject of this blog. He posted about People Analytics vs HR Analytics a couple of weeks ago and I've followed the conversation around it. Hip's LinkedIn post here. So what does the data say? Workforce vs HR vs People? Being an analytical person at heart, I turned to the data and analyzed job titles containing "HR, Workforce, People, Human" and "Analytics or Analyst". As you can see in the table below (truncated for space), the data isn't supportive of people analytics being the most popular term. In fact, you have to get down to row 25 before you see a people analytics title. HR Analytics and Workforce Analytics related titles are the clear leaders here by volume. Keep in mind, though, that titles, particularly for less senior roles, can take time to adapt, especially in the more rigid position structures of larger organizations. Likely, many of these junior roles have a more basic reporting focus than an analytics focus. So why then does it feel like People Analytics has become the dominant term for what we do? The Evolution of HR Analytics (and my opinion) I believe that it's not so much a difference between HR Analytics and People Analytics, but rather an evolution in the term. Let’s Start with the Evolution of Workforce Analytics Early on, when we were delivering Workforce Analytics, it was to only a handful of forward-thinking organizations that also had the budget to be able to take workforce reporting seriously.
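A keyword filter like the one behind the title analysis above can be sketched in a few lines. The titles below are invented for illustration; the real analysis would run over a large set of job postings:

```python
# Keywords from the analysis: a role word AND an analytics word must both appear.
role_words = ("hr", "workforce", "people", "human")
analytics_words = ("analytics", "analyst")

# Hypothetical sample of job titles (illustrative only).
titles = [
    "HR Analyst",
    "Workforce Analytics Manager",
    "People Analytics Lead",
    "Software Engineer",
    "Human Capital Analyst",
]

def matches(title):
    t = title.lower()
    return any(w in t for w in role_words) and any(w in t for w in analytics_words)

hits = [t for t in titles if matches(t)]
print(hits)
```

Counting the hits by keyword group is then a simple tally, which is how you would see HR Analytics and Workforce Analytics titles outnumbering People Analytics ones by volume.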
I specifically say reporting because mostly that's what it was: getting data into the hands of executives and directors wasn't happening at scale, so even basic data would blow people's minds. It's crazy how often the same basic data still blows people's minds 20 years later. There were not many teams running project-focused analysis like there are today, for example, looking at drivers of turnover to trial different retention initiatives, or at how onboarding programs affect net promoter scores of recent hires. Workforce Analytics was for the most part aggregate reporting. The analysis of this was primarily driven by hardcore segmentation of the data, looking for nuggets of gold, by a handful of curious people. It was done at scale with large numbers and rarely focused on small populations. A Look at How HR Analytics Differs HR Analytics is by far and away the most common term and has lived alongside Workforce Analytics for a very long time now. It is a natural extension of the naming of the human resources department: you're in HR, looking at HR data from your Human Resources Information System (HRIS), therefore you are an HR Analyst. If we were to be more aligned with the term, we would be analyzing the effectiveness and efficiency of the HR function, e.g. HR staffing ratios and everything else that goes along with it. An HR Analyst in this fashion would be more aligned with the Talent Acquisition Analyst roles that we see growing in the domain today. In my view, HR Analytics is really no different to Workforce Analytics, and we will see these titles transition towards People Analytics over time. Why Evolve to People Analytics Then? I do not believe there is a significant difference between people analytics vs HR analytics vs workforce analytics in terms of the work that we do.
The evolution of the terms, in my opinion, has been more about how we view people as individuals in our organizations, as opposed to the large-scale aggregate of a workforce or, even worse to me, as "human resources". We've recognized as a discipline that people need to be treated and respected as individuals, that we need to provide career development and support in their lives, and that it is important that people actually take vacation time. It is treating people as people and not as numbers cranking out widgets. It is no coincidence that knowledge-worker organizations have been the biggest adopters of people analytics; they have the most to gain, especially in a tight labor market where choice and compensation are abundant. Care for workers is now a must, whereas many years ago it was a different story. I love the fact that we have people analytics teams who are going deep on how they promote a diverse workforce and how they create career development opportunities. We even have one customer that integrates cafeteria data into their solution to help identify what people are enjoying eating. So is it just a branding change? Yes, and no. Our space has definitely matured, and our capabilities have grown. We've moved from basic reporting and access to data, which is now table stakes, to project-based analysis with intent and hypotheses to prove or dispel. People Analytics is a more mature discipline than it ever was, but effectively the same activities could roll up under either term. Impacting people's work lives through our analysis of data is ultimately our goal, and having that outcome in mind is why we'll see further adoption of People Analytics as a term. We'll see job titles change to reflect this move over time. And I'm certainly not always right; there are larger nuances between these terms applicable to some organizations. Heather Whiteman gives a good overview of a more nuanced definition here. Interested in Learning More?
So whether you call it HR Analytics or People Analytics, if you're new to this and want to understand what it can do for an organization, check out the eBook written by Heather Whiteman and Nicholas Garbis, Explore the Power of People Analytics, for a deeper dive into this area. Download eBook Today
28 min read
Tony Ashton
[This article is taken from a presentation I delivered as part of a broader session on People Analytics for the Australian Human Resources Institute (AHRI) QLD Analytics Network on 7 October 2021. It uses the presentation slides and accompanying presentation script only slightly modified from the spoken word to fit the written form. thanks, Tony] This article focuses on setting up a people analytics capability in your organization, thinking about what the key challenges are and how to resolve them. Let’s start by talking about data-driven insights; people analytics is the focus here. More broadly, there is an untold number of articles and research papers on the importance of data-driven decision making, and my bookshelf is full of these books and papers. I'm sure yours are too (or your virtual bookshelves). Here is a nice example from Deloitte to set the scene. Of the organizations they surveyed, 39% had a strong analytics culture, and of those, 48% were significantly exceeding their business goals. Of those that didn't display a strong analytics culture, only 22% were significantly exceeding their goals. There's a double whammy in terms of the proportional impact that analytics gives to an organization. But importantly, there's also this angle of culture. In the survey, most executives believed that they weren't really that insight-driven as an organization. So there is this challenge between the ability to pull data together, derive insights, and actually make decisions, and making that a part of the framework for how business is done. So why is this important to HR? If you can't connect people to business outcomes, then you're really just doing stuff because you think it's a good idea. Doing stuff that you think is a good idea is ok, but it will only get you so far - being able to prove it is a good idea and measure your impact is another thing entirely.
The importance of data-driven decision making for HR Using data helps you prioritize your strategies. You can't do everything, so you need to focus your resources in HR; metrics and data help you do that, and this helps you build that culture of data-driven decision making. If you think about the people space, people-related decision making and HR processes are all underpinned by principles like merit, natural justice, fairness and transparency. Without a data-driven approach to this, you're at risk of replicating the same diversity issues that you see in many organizations: pay equity issues, how promotions and pay increases are awarded, or who should be hired. Some of these examples are on a macro scale, for example, your whole company's diversity profile, and some at the micro level, but the same general principles apply. Data is important for setting strategy and for tactical decision making At the micro scale, let’s take a specific example of an individual hiring decision. We have selection criteria for hiring to ensure we get the right person for the role and use multi-source inputs to the process to base decisions on evidence and to avoid bias, nepotism, discrimination etc. At the micro level, you rely heavily on good processes, training and company culture. Ideally, guiding these processes and strategies would be a great analytical understanding of your organization’s diversity profile, the skills and capabilities required for the next 2-3 years, market pay rates for similar roles, the complexity of the role, expected time to productivity… and so on. Hiring strategies in the absence of this data are going to be much less effective than they would otherwise be. The reality is that HR has been somewhat late to the party around the use of data and people analytics. If we think about this from a simple business accountability perspective, HR teams are custodians of lots of systems. Not many organizations have just one system.
And even if you have just one, we still have to curate and care for that information. It's a rich asset to the organization. Putting data in the hands of managers is critical for creating a data-driven culture Let’s look at some research from the Annual HR Systems survey. This survey provides a rich set of longitudinal research and here I’ve highlighted some insights they developed regarding the differences between organizations that are data-driven compared to those that are less so. This construct is similar to the Deloitte research we talked about earlier. The bars on the left of the chart are the results for organizations that are described as not being data-driven, and bars on the right are those that are identified as being data-driven. As you would expect all segments on the right-hand side are higher than on the left, but by far the biggest difference and the thing that really stands out as being different is the deployment of information to managers, putting information in the hands of decision makers. I have circled this segment in red on the chart. So, this gives us something to think about in terms of what drives success. Success isn't necessarily having a great dashboard, success is determined by whether or not people are using data and making decisions with it. The “maturity” of people analytics There has been a lot written on this topic across the decades, there are more books and research papers than you can imagine. Just a few examples here. This is extremely well-trodden terrain and there is no shortage of great information to draw from. Facilitating the Utilization of HR Metrics – The Next HR Measurement Challenge; Irmer, Bernd E (Ph.D); Ellerby, Anastasia (MBA); Blannin, Heather, 2004 Early research on driving the adoption and use of people data In terms of the topic of adoption, this is a key theme for this discussion, and the focus is on the actual use of data in organizations. 
The image above is an extract from a paper published around 2004 by the InfoHRM company in partnership with the Corporate Leadership Council within the Corporate Executive Board (subsequently acquired by Gartner). This research identified the key phases of sophistication around the use of HR data for business impact. The phases were characterized as: (1) getting your house in order by automating reporting and reducing the load from ad-hoc queries by introducing self-service; (2) starting to use more advanced metrics and multidimensional analysis; and then (3) deploying more broadly into everyday decision-making and impacting business outcomes. Through a detailed survey and interview process, companies self-identified into one of these categories regarding the maturity of their use of HR data. There was a big difference in what was required of companies in phase three, and we will talk more about this. Facilitating the Utilization of HR Metrics – The Next HR Measurement Challenge; Irmer, Bernd E (Ph.D); Ellerby, Anastasia (MBA); Blannin, Heather, 2006 Two years later, the study was re-run and the framework was updated based on more findings and longitudinal data. There was an even stronger focus on understanding how to drive adoption, and the research found that the dotted line after phase two was even more pronounced: it was a really hard barrier for companies to jump across. The research provides a number of tactics and best-practice advice to address this, but it was clear that having the technology to help scale, automate, improve quality etc. is necessary but not sufficient for success; success takes good change management, cultural alignment and a business-impact orientation. It is the latter topics that also drove the creation of the additional phase, i.e. those companies that were truly having an impact on business outcomes through the use of HR data.
This research was happening at a time when HR itself was heavily focused on the prevailing thought leadership of Dave Ulrich around HR and business alignment, and other leading work by Mark Huselid, Brian Becker, Richard Beatty, John Boudreau, and others. A big part of being a business partner and a business driver was the use of data and evidence-based decision making. Maturity models enter the mainstream Interestingly, around a similar timeframe, Gartner was building its own model for how companies could be more data-driven and use business analytics across an organization [the earliest reference to this I can find is from 2009]. Gartner’s framework took the form of a four-phase model describing the increasing difficulty for companies of moving from descriptive analytics up to being able to deliver prescriptive analytics for the highest value. Bersin & Associates (since acquired by Deloitte) published this model around 2010. As you can see, it has lots of similarities to what came before, presenting a maturity scale of using people analytics starting from operational reporting through advanced reporting, advanced analytics and up to predictive analytics. Defining success While these models have helped companies and people analytics teams assess where they are and the opportunities to make more of a difference, I have a problem with all of them. The problem is that they set up prescriptive or predictive analytics as the main destination everyone should be striving for, implying that if you're not doing predictive analytics, then you're not really doing anything of worth. That sets up an expectation that is hard to reach and not necessarily the right destination. Something I'd recommend considering: how you see success and what matters to your organization is the most important thing; how you get there is just a part of the journey. Build a sustainable capability and avoid the key person dependency risk So, what do we need to do?
Many of you, myself included, may have had a role that could be characterized as “the Excel ninja” in your organization or HR team. You are able to crunch through data, get data from lots of places, massage it, put it together, create some amazing reports and dashboards, and share them around. But then if someone wanted to see that data cut differently, that became a pile of work and maybe your weekend. This is all great for job security and feeling important and needed, but before long you get bored or burnt out, or both. And then you leave or move on to another role. You may have written some great handover notes, but there is an immense amount of tacit knowledge locked up in your brain, and everyone likes to do things their own way, so the next person would invariably reinvent everything. In the meantime, it is probably hard to fill the role, because people with the right skills are scarce in HR. Basically, relying on the Excel Ninja isn’t a great idea for any company, as at some point all their people analytics capability is going to walk out the door and they have to start again. Data Scientists are amazing, but you need to build a broader people analytics team The lesson there is around building sustainable capability, not just relying on a single person. Now, get ready for a feeling of déjà vu. We are in a very similar position today with the role of the data scientist. Everyone wants to be at the top of the maturity scale, right? So, how to get there? Just hire a Data Scientist! But you are actually creating a much worse problem than you had with the Excel Ninja. The data scientist is definitely a superhero and is able to do amazing things. But, as before, if you rely on only one person, you're at risk of not creating a sustainable capability for your organization. It is compounded here too, because 80% of the time the Data Scientist is cleaning and aligning datasets and curating predictive models.
Most of the time this work is not repeatable and is designed for specific investigations, which can result in great insights, but pretty soon they get fed up and move on and you are left with a massive hole in your people analytics capability yet again. https://www2.deloitte.com/us/en/pages/human-capital/articles/people-analytics-and-workforce-outcomes.html Deloitte has been doing some nice work around evolving this thinking to be more focused on capability creation, as opposed to an escalating pathway of sophistication. Peter Howes webinar discussing this and other related topics: https://www.onemodel.co/events/peter-workforce-planning-webinar Reviewing all of this material, I was reminded of a webinar I hosted in 2019 with Peter Howes. As many of you know, Peter is a giant in the industry. He was the founder of Infohrm and a pioneer in strategic HR and HR systems, a speaker, educator, and author - a true thought leader in every sense of the term. Peter created this model around 1980. His core principles are all still valid today, and it remains probably one of the best characterizations of people analytics done well that I have seen. Essentially, if your team is wrapped up in administrative tasks, you should aim to shift the mix to include professional and strategic activities for greater business impact. You still need to do operational and tactical reporting; that never goes away. Getting greater efficiency and automation for these activities frees you up to do work with greater business impact. The biggest challenges with People Analytics It is pretty clear that the challenges for adoption of people analytics have been around for a couple of decades now, and while technology has caught up with our desires, there is a lot more to success in harnessing technology and developing a sustainable people analytics capability. 1.
Data is spread across multiple systems Even if your company has purchased an HRIS suite, you will still have issues pulling data together from across those different applications, and invariably you will also have data in lots of different systems. You spend 80% of your time assembling data and probably no more than 5% of your time doing true insight generation. 2. Data is not trusted by leaders If someone doesn't like the message they are hearing from HR, they're going to attack the data. If you have any data quality issues, then that's going to show, and it will undermine everything. Even if the inaccuracy is minor and doesn’t affect your message, it is an opening - a weakness. People will start generating their own data and using different definitions, resulting in a lack of consistency and trust. 3. Analytical tools are not being adopted If your tools are too complex, then they won't be used. This is why many tools don't get used in most organizations, not just people analytics products. There are too many options, too many things to click, and that is a barrier to adoption. Focusing the solution on the real needs of the different users and personas is critical. More is not better in this case; focused insights and fewer options for the end user are what will bring success. 4. Data security & privacy is really complex Obviously, in HR, security and privacy are critically important, and often a major reason why people data is not shared around organizations. Think back to the life of the Excel Ninja: they are probably generating hundreds of different spreadsheets and emailing them to managers. This is a lot of work, but it is also inherently risky. 5. High expectations for Data Science and AI/Machine Learning Machine Learning (ML) and Artificial Intelligence (AI) are seen as too futuristic by most, despite the incredible amount of hype. “How do we even get started?” is an all too common refrain. 6.
Data and predictive models in HR apps are very “black box” Predictive models and even basic data transformation models are often locked in the head of your Excel Ninja, or in a black box from your software vendor. This means you lack the transparency to understand whether there are quality issues in the movement of data and calculations, whether there is an inherent bias, or how reliable and trustworthy those models are. Back to where we began this discussion: if you are making decisions that impact people’s lives, you need to have good, reliable evidence. Solving these challenges is necessary for success. So how do we actually do that? Let's talk through some tactics and ideas. Solving the People Analytics challenges STEP ONE - Bring your data together Naturally, bringing your data together is step one. Ideally, into a single data model, or if not, at least a repeatable process for merging your data so there are no manual steps in the process. This is really important, because any manual process means you are again spending time on less value-adding work, taking you away from insight generation, and opening up opportunities for errors. STEP TWO - Create a set of key metrics and definitions Creating a set of metrics and a set of definitions is really important, because then you've got consistency. You can then drive reliability and quality through that process. STEP THREE - Deploy simple, guided storyboards/dashboards & data exploration tools Then, with a set of defined metrics and storyboards (or dashboards, or whatever you call them) that are consistent and easily understood, you are able to start driving adoption. People get familiar with the frame you are presenting, the terms and the language, and the definitions. This brings a baseline of shared understanding and learning and the ability to then start adding to that through time.
STEP FOUR - Wrap everything in role-based security from the start In terms of security, you should think about security through the concept of user personas for which you construct roles, not security for individuals. Think about your Executives, GMs, People Leaders, HR Business Partners (HRBPs) etc., what the different roles are and what data they need to see, and then craft the security around the roles. This allows you to set your data free, using security as a way of deploying content, not holding it back. Drowning in spreadsheets is often seen as a problem for data consistency, effort etc., but it is also a major security issue that can be avoided by taking this approach. STEP FIVE - Leverage technology and skills to enable the use of ML/AI & predictive insights The technology issues around ML/AI are completely solvable. There is lots of technology available, and it is not really a technology problem anymore. It is more an issue of capability and understanding. The key is to leverage technology in a scalable way and not fall into the key person dependency trap. STEP SIX - No magic allowed - make everything fully transparent & explainable This leads into the last point, which is: don't allow the use of black magic and closed systems. Make sure that everything is explainable and understandable when it comes to metrics and predictive models, or whatever kind of analysis you're doing. Some practical examples of People Analytics in practice Let me share a couple of quick examples. Here is a storyboard that is structured around a specific topic; it poses the key questions your audience would be asking, and these lead them through the data. So it's really easy to understand what's going on, with layered complexity from the high-level summary trends through to the details. Everything is interactive; you can click and drill. We are leading people through the topic, pre-empting the questions that commonly arise when consuming this content.
Below is an example using more of a classic KPI-style Storyboard. Here you can assemble and browse through the KPIs from the simple to the more advanced, but the layout is consistent and easy to track, from the big headline number through the trends and the detailed breakdowns. At any point, you can click and drill. One of the most important features here is the pervasive library of formulas, definitions, and explanations. Just as important is the ability to drill into the details and see who the people are behind the analysis (naturally, all of this is seamlessly controlled by role-based security). The ability to drill down lets you validate the information, but it also gets you into action: you can quickly dig into key employee segments, identify risks, and target interventions. These are just a couple of examples of what you can do to get started fairly simply, but quickly make a big difference in your organization.

Building on the previous examples, in the scatterplot below we have added a correlation, which is normally something scary for the average non-statistician. But if you look at the text above the chart, you can see an automatically generated written interpretation of the results in simple business language. Instead of just providing the numbers and expecting people to understand what a correlation coefficient is, or how to interpret significance, be explicit and explain whether something is significant or not; this goes a long way. Here is a zoomed-in view so you can see this more clearly: the chart heading is in the form of a question, and the text directly answers that question.

A Summary of Tactics to Build your People Analytics Capability

The slide above summarizes some of the tactics we have covered, with a few additions to help you build a people analytics capability in your organization. If you don’t have the skills in HR, borrow from other disciplines; find the experts in the organization who can help you.
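The automatically generated correlation text described above can be approximated in a few lines. This sketch computes a Pearson correlation and renders it in plain business language; the thresholds, wording, and variable names are illustrative assumptions, not One Model's algorithm.

```python
# Turn a correlation coefficient into a plain-language sentence,
# in the spirit of the auto-generated chart text described above.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def interpret(r, x_name, y_name):
    """Describe the relationship without assuming statistical literacy."""
    strength = "strong" if abs(r) >= 0.7 else "moderate" if abs(r) >= 0.4 else "weak"
    direction = "rises" if r > 0 else "falls"
    return (f"There is a {strength} relationship: as {x_name} increases, "
            f"{y_name} typically {direction} (r = {r:.2f}).")

r = pearson_r([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 8.0, 9.8])
print(interpret(r, "tenure", "engagement score"))
```

The point is the last step: translating the coefficient into a sentence a non-statistician can act on.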
Reach out to the broader people analytics community. There are lots of resources, networks, and people ready to help. The People Analytics practice and network is bigger now than it has ever been. Also remember that it's not always just about the data. You're in HR, so talk to people: check your findings, go around the organization, and build your own network to better understand what's actually happening.

Some final thoughts

Focus on the questions that matter to your business. Start with a small set of things that are repeatable and build trust. This will give you time to do the more interesting stuff, find opportunities to drive success, and then market your successes. You can build a groundswell of people wanting analytics rather than you forcing it upon them. And again, it's about insight: not just the data, but the actions you can take and the impact you can make. People Analytics is one of the hottest areas that organizations are looking to hire into internationally. The above framework is designed to help you put all this into practice. You need to deal with the job of data orchestration to get all of your data into one place and one logical construct. Focus on storytelling, not just generating dashboards. Blend predictive analytics into this rather than treating it as an add-on.

Answer the questions that matter

If you are interested, One Model has heaps of assets that we can share with you. For example, contact us if you want some inspiration around the questions that matter. We have a great library of these, and it is a really engaging way to talk to people in your organization about people analytics in a non-technical way. We also have an e-book titled "Explore the Power of People Analytics" that’s a great resource to get started.
11 min read
Nicholas Garbis
Only humans would bother inventing something as complex as the concept of species. Attempting to organize every living thing around us into distinct buckets has been a massive and never-ending enterprise. We like categories and we love to argue about them! What's the Difference between People Analytics and Reporting? HR Reporting and People Analytics are intertwined concepts which are more valuable when they are clearly articulated for their distinct purpose and value to the organization. Both are necessary for the effective management of the workforce and the HR processes that are aiming to achieve efficiency and employee/manager experience. HR Reporting and People Analytics have been debated as being the same, as being different, as being parent-child and child-parent. So what’s all the fuss, and how can I offer some thinking that helps? Stirring up muddy water to make it become clear seems foolish, but here it goes. Consider the word origins of the two key words: “Report” is based on the word “port” which means to carry something from one place to another. Reports share information. “Analytics” is based on the word “analyze” which means to decompose and recombine something into something that increases understanding. Analytics facilitates insight. Both HR Reporting and People Analytics are built from a foundation of data that is generated by the multitude of systems-processes that exist within every organization. (See image below) Most systems are built to facilitate processes (eg, hiring) and generate data as a valuable by-product (eg, job open and closed dates). Other systems such as survey systems exist solely for the purpose of generating valuable data. What is HR Reporting? HR Reporting can be generalized based on the traits it most commonly displays (accepting that these lines can be blurry, just as some plants can behave in ways that are quite a bit more like animals). HR Reporting typically is ... 
- Designed to provide information (versus insights)
- Simple in format, often as lists or tables, possibly in a multi-tab spreadsheet
- Data is often from a single system (often, not always)
- Raw data, occasionally with calculations applied (ie, metrics)
- Rather fixed in structure and limited in terms of user interactivity
- Used in monitoring transactional activity (eg, list of currently open job postings)
- A source of data that is extracted for analysis in another tool (eg, Excel)
- Could be a part of a larger people analytics project

HR Reporting is used by:

- Process and technology owners (eg, recruiting ops)
- HR functional leaders (eg, learning or talent management leaders)
- People leaders (eg, managers of teams)
- Executives (eg, C-Suite, CHRO, VPs, DEI leaders)

HR Reporting is Valuable

HR Reporting is an essential point of access to the data within a given system, enabling the owners of the related processes to retrieve data for review and analysis. I can’t think of a system that doesn’t provide at least some access to the underlying data via reporting. Reporting from these systems is typically organized into some set of pre-defined tabular views, each providing users with options to filter the data to the specific parts of the organization or process steps they want to view (eg, course registrations for the Finance department). Metrics, which are calculations based on the raw data, may also appear in reports but tend to be summarizations of the transactional data (eg, average time-to-fill for each recruiter). Trends of the data, perhaps displayed with different time periods in different columns, are also common and valuable. HR Reporting is distinguished from “analytics” because analytics tends to be aimed more at generating insights rather than sharing information. Let’s look at People Analytics next.

What is People Analytics?

People Analytics is more complicated to define.
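As a concrete example of the summarization metrics mentioned above, here is a sketch of average time-to-fill per recruiter computed from transactional requisition rows; the field names and sample data are hypothetical.

```python
# Average time-to-fill per recruiter, summarizing transactional rows
# (a typical reporting metric; field names here are illustrative).
from datetime import date
from collections import defaultdict

requisitions = [
    {"recruiter": "Avery",  "opened": date(2021, 1, 4),  "closed": date(2021, 2, 3)},
    {"recruiter": "Avery",  "opened": date(2021, 1, 11), "closed": date(2021, 2, 20)},
    {"recruiter": "Jordan", "opened": date(2021, 1, 5),  "closed": date(2021, 1, 25)},
]

def avg_time_to_fill(rows):
    """Mean days from requisition open to close, grouped by recruiter."""
    by_recruiter = defaultdict(list)
    for r in rows:
        by_recruiter[r["recruiter"]].append((r["closed"] - r["opened"]).days)
    return {rec: sum(days) / len(days) for rec, days in by_recruiter.items()}

print(avg_time_to_fill(requisitions))  # {'Avery': 35.0, 'Jordan': 20.0}
```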
To begin with, it can represent: a category of deliverables (eg, interactive dashboards), a team within the HR organization (eg, PA COE), a set of activities (ie, consulting & advising), or a combination of all of these. For comparison with HR Reporting, which is a type of deliverable, we will focus on People Analytics defined as a category of deliverables.

Maturity Continuum? No, Sorry.

HR Reporting and People Analytics do not belong on a maturity continuum; they are both vital parts of running an organization well. Sure, if an organization has no People Analytics, you could confidently say it is less mature than another organization that does. You could even say that one organization’s People Analytics deliverables are more advanced (ie, mature) than another's. The point is, you don’t move from one level (HR Reporting) to another level (People Analytics) -- you need to deliver both and do them well, even if we agree that People Analytics will create more value for the organization. Here’s a chart that may help orient the two:

Notice the difference in the objectives of each: People Analytics is focused on generating insights. In fact, some advanced analytics solutions will surface insights directly within the solution, but most often the insights are expected to occur when the user views and interacts with the content in the deliverable. The value of People Analytics is more in the strategic realm, whereas HR Reporting generates more operational value focused on delivering information to keep the business running. Of course, there is some crossover, but generally, reporting helps with operational items such as efficiency, process monitoring and improvement, auditing, quality control, etc. Analytics is aimed at generating insights that will lead to decisions and actions.
Analytics content is designed to facilitate valuable insights “at the speed of thought”, and in online settings this is achieved through interactive user experiences, issue highlighting, and embedded insights using natural language. Analytics content facilitates hypothesis generation and testing simultaneously and is a learning and discovery vehicle for users.

Deliverables: People Analytics Projects, Products, and Services

Let’s outline People Analytics deliverables in terms of products, projects, and services. These are all aimed toward generating insights at scale that will drive the best quality, data-informed talent decisions. Systems and technology are not listed here because they are not deliverables, but enabling elements that help generate the deliverables.

Products

Analytics content is most often distributed online via an analytics platform (like One Model), including metrics that may be sourced from multiple systems, and sometimes will have output from AI/ML predictive models.

- Dashboards / Storyboards -- an interactive collection of metrics with an explicit design goal of generating insights by or for the user. Some of this may be data science (AI/ML) results that are packaged for broader consumption. “Storyboards” are a variation designed, often with a question format, to elicit a story or path of thinking.
- Embedded Data Science -- AI/ML and modeling results that use HR data and are embedded within other products (eg, a time-to-fill prediction that is consumed by recruiters directly within the recruiting system/ATS, or a restaffing projection rate within a project planning solution).
- HR Reporting -- while this is not ‘analytics’ in our working definition, these deliverables are often part of the People Analytics team’s responsibility, so they are listed here so as to not be forgotten.
Projects

- Deep-dive studies: covering a given topic, possibly testing a hypothesis, usually culminating in a presentation or delivered document containing data, metrics, visualizations, written insights, and even conclusions and recommended action steps. These may include advanced analytics and data science methods.
- Experiments and explorations: digging into the data to understand relationships further, test hypotheses, generate mock-up content that may go into production, etc.

Services

- Evaluation: creating learning opportunities to elevate the analytics skills of partners (eg, HRBPs), in generic terms or specific to the People Analytics platform.
- Change Management: developing communications and user engagement plans to help drive adoption of tools and methods.
- Consulting: offering guidance on strategic decisions and programs that may be evaluated in response to insights generated through the PA team’s products and projects.

From Data to Deliverables

Let’s return to the diagram we shared in the previous section and expand on it a bit. The systems and processes that generate data are foundational to both. Data is selectively extracted from the multiple systems into a data layer where it is integrated. This layer can be a standalone data warehouse (eg, in Azure or Snowflake) or can be part of a solution. Metrics are calculated (eg, headcount, turnover rate) and dimensions are created (eg, organization unit, company tenure) by applying business rules against the data. Dashboards and Storyboards are designed and developed within a visualization layer (eg, Tableau) or within a People Analytics solution. Data Science will be done using data extracted from the warehouse into tools such as R or Python, or within the data science module in a People Analytics solution (at the time of this article, only available in One Model).
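The "business rules" step above can be illustrated with a small sketch: deriving a company-tenure dimension from raw data so it can be used for slicing. The band edges and field names are illustrative assumptions, not a One Model standard.

```python
# A business rule that turns a raw attribute (tenure in years) into a
# dimension (tenure band) used for grouping and filtering in analytics.

def tenure_band(years: float) -> str:
    """Bucket company tenure into bands; edges are illustrative."""
    if years < 1:
        return "< 1 year"
    if years < 3:
        return "1-3 years"
    if years < 5:
        return "3-5 years"
    return "5+ years"

employees = [
    {"id": 1, "tenure": 0.4},
    {"id": 2, "tenure": 2.5},
    {"id": 3, "tenure": 7.0},
]
for e in employees:
    e["tenure_band"] = tenure_band(e["tenure"])

print([e["tenure_band"] for e in employees])
# ['< 1 year', '1-3 years', '5+ years']
```

Because the rule lives in one place, every dashboard that slices by tenure band agrees on where the bands begin and end.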
Analytics Projects will be combining elements of all the underlying pieces in what will become a presentation (written and/or verbal), usually on a key topic of interest to leadership. Concluding Thoughts As demonstrated above, HR Reporting and People Analytics are intertwined concepts which are more valuable when they are clearly articulated for their distinct purpose and value to the organization. Both are necessary for the effective management of the workforce and the Human Resources processes that are aiming to achieve efficiency and employee/manager experience. You don’t need to have perfect reporting before you begin doing analytics. The two can mature in tandem and are often mutually reinforcing. A robust, integrated, and flexible data foundation is going to provide the greatest value by ensuring the analytics deliverables do not ‘hit a ceiling’ where the next tier of value becomes unachievable without going back to the architecture of the data warehouse. Think “value first.” Obsess about how your team can generate the most value at the fastest pace for your organization, not about the arcane differences between commonly used terms. To learn more about people analytics, download a free copy of the eBook Explore the Power of People Analytics (value $8.99 in paperback on Amazon) I co-authored with Heather Whiteman, PhD. Download My Free Copy
2 min read
Chris Butler
One Model took home the Small Business Category of the Queensland Premier's Export Awards held last night at Brisbane City Hall. The award was presented by Queensland Premier and Minister for Trade, Hon Annastacia Palaszczuk MP and Minister for Employment, Small Business, Training and Skills Development, Hon Dianne Farmer MP. “We are delighted to receive this award given the quality of entrepreneurs and small business owners in Queensland,” One Model CEO, Chris Butler said. “It is a tribute to the exceptional team we have in Brisbane and the world leading people analytics product One Model has built.” “From our first client, One Model has been an export focussed business. With the profile boost this award gives us, we look forward to continuing to grow our export markets of the United States, Europe and Asia,” Mr Butler said. Following this win, One Model is now a finalist in the 59th Australian Export Awards to be held in Canberra on Thursday 25 November 2021. One Model was founded in Texas in 2015, by South-east Queensland locals Chris Butler, Matthew Wilton and David Wilson. One Model generates over 90% of its revenue from export markets, primarily the United States. One Model was also nominated in the Advanced Technologies Award Category. One Model would like to congratulate Shorthand for winning this award as well as our fellow finalists across both categories - Healthcare Logic, Tactiv (Advanced Technologies Category), iCoolSport, Oper8 Global, Ryan Aerospace and Solar Bollard Lighting (Small Business Category). The One Model team would like to thank Trade and Investment Queensland for their ongoing support. To learn more about One Model's innovative people analytics platform or our company's exports, please feel free to reach out to Bruce Chadburn at bruce.chadburn@onemodel.co. PICTURE - One Model Co-Founders Chris Butler, Matthew Wilton and David Wilson with Queensland Premier, Hon Annastacia Palaszczuk MP and the other award winners.
11 min read
Chris Butler
This week One Model was delighted to participate with an elite group of our industry peers in the HR Tech Alliances Virtual Collaboration Zone - Best New People Analytics Solution competition. I'm excited to share some detail on what the judges saw to justify the outcome. This wasn't an empty competition either; it had some significant companies in the field. The overall scores were as below:

1st - One Model - 4.28
2nd - activ8 intelligence - 4.06
3rd - Visier - 3.93

Given how proud I am of our team for winning this award, I thought I would share our presentation. Before I do that, I would like to acknowledge how far the pure-play people analytics space has come in recent times. As an industry, this is something we should celebrate as we continue on a path of innovation to deliver better products and better outcomes for our clients. People analytics is an exciting place to be as 2020 comes to its (merciful) conclusion! We'll take a quick tour through the highlights of our presentation and demonstration.

Who are we?

One Model provides its customers with an end-to-end people analytics platform that we describe as an infrastructure. We call it an infrastructure because, from the ground up, One Model is built to make everything we do open and accessible to our most important stakeholder - you, the customer. Everything from our data models to our content catalogues, right down to the underlying data warehouse, is transparent and accessible. One Model is not a black box. Over the last five years, we have been guided by the principle that because of One Model’s transparency and flexibility, our customers should feel as if this is a product they built themselves.

Our History

For those of you who are unfamiliar with the history of One Model, the core of our team is derived from workforce analytics pioneer InfoHRM. InfoHRM was acquired by SuccessFactors in 2010, and subsequently by SAP in 2012.
During our extraordinary ride from humble Australian business to integral part of one of the world’s largest software companies, our team learned that while our solution gave low-maturity users what they needed in terms of the what, why, and how of measuring their workforce, it remained an inflexible tool that customers outgrew as their own capabilities increased. With increased sophistication, customers were asking new and more complicated questions, and the solution simply couldn't evolve with them. Five years later and, sadly, this is what we continue to see from other vendors in our space. Meeting our customers where they are on their people analytics journey and supporting them through their evolution is fundamental to the One Model platform. Be Open; Be Flexible; Don't put a ceiling on your customers' capabilities. One Model takes care of the hard work of building a people analytics infrastructure. We built One Model to take care of low-maturity users, who need simple and supported content to understand the power of people analytics, while delivering an experience that customers grow into, where higher-maturity users can leverage the world-leading One AI data science and statistical engine. Furthermore, if they want to use their own tools or external data science teams, their people analytics platform should enable this - not stand in the way.

One Model’s Three Pillar People Analytics Philosophy

Pillar 1: Data Orchestration

People data is useless if you can’t get access to it. Data orchestration is critical to a successful people analytics program. At One Model, Data Orchestration is our SUPERPOWER! Many thousands of hours have been invested by our team in bespoke integrations that overcome the native challenges of HR Tech vendors and provide full, historic, and transactional extracts ready for analytics. This isn’t easy. Actually, it’s terrifyingly hard.
Let’s use Workday as an example. To put it mildly, the data from their reporting engine and the basic API used to download these reports is terrible. It's merely a snapshot that doesn't provide the transactional detail required for analytics. It's also impossible to sync history as it changes over time - an important feature given the nature of HR data. You have to go to the full API to manage a complete load for analytics. We are 25,000 hours in and we're still working on changes! To power our data orchestration, we built our own Integrated Development Environment (IDE) for managing the enormous complexity of people data and to house our data modelling tools. Data quality and validation dashboards ensure we identify issues and continue to monitor data over time for correction. Data destinations allow us to feed data out to other places; many of our customers use this to feed data to other vendors or push it to other business units (like finance) to keep them up to date. Unlike garden-variety superpowers (like flying), our data orchestration capability did not develop by serendipity or luck. It developed, and continues to develop, through the hard work and superior skills of our team.

Pillar 2: Data Presentation

Most other vendors in our space exist here. They don't provide open and flexible toolsets for Data Orchestration or Value Extraction / Data Science. When we started One Model, we hadn't planned on a visualization engine at all. We thought we could leverage a Tableau, Looker, or Birst OEM embedded in our solution. After much evaluation, we realized we just couldn't deliver the experience and capability that analyzing and reporting on HR data requires. Generic BI tools aren't able to deliver the right calculations, with the right views across time, in a fashion that allows wide distribution according to the intense security and privacy needs of HR. We had to build our own.
Ultimately our vertical integration allows unique user security modelling and integration of One AI into the frontend UI, all while not limiting us to the vagaries of someone else's product. Our implicit understanding of how HR reports, analyzes, and distributes data required us to build an HR-specific data visualization tool for One Model.

Pillar 3: Data Science / Value Extraction - One AI

I like to describe the third pillar of our people analytics philosophy as our 'Value Extraction' layer. This layer is vertically integrated on top of our data models; it allows us to apply automated machine learning and advanced statistical modelling, and to augment and extend our data with external capabilities like commute time calculations. Predictive capabilities were our first target, and we needed to build unique models at scale for any customer, regardless of their data shape, size, or quality. A one-size-fits-all algorithm, which is what most other vendors in the HR space provide, wasn't going to cut it. Enter automated machine learning. Our One AI capability will look across the entire data scope for a customer, introduce external context data, select its own features, train potentially hundreds of models and permutations of those models, and select the best fit. It provides a detailed explanation of the decisions it made - enough to keep any data scientist happy. The best of these models can be scheduled and repeated, so every month it could be set to re-learn, re-train, and provide an entirely different model as a fit to your changing workforce. This capability doesn't lock out an experienced team; it invites them in should they wish to pull their own levers and make their own decisions. The One AI engine is now being brought to bear in real time in our UI, tackling forecasting, Bayesian what-if analyses, bias detection, anomaly detection, and insight detection.
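The "train many candidates, keep the best fit" idea behind automated machine learning can be sketched in miniature. The candidate models and scoring below are deliberately simplistic stand-ins, not One AI's actual algorithms: two toy models are fit on training data, scored on held-out data, and the better one is kept.

```python
# Toy model selection: fit several candidates, score each on held-out
# data, keep the best. Real AutoML does this across hundreds of models.

def mean_model(train_y):
    """Baseline: always predict the training mean."""
    m = sum(train_y) / len(train_y)
    return lambda x: m

def linear_model(train_x, train_y):
    """Least-squares fit of y = a*x + b."""
    n = len(train_x)
    mx, my = sum(train_x) / n, sum(train_y) / n
    a = sum((x - mx) * (y - my) for x, y in zip(train_x, train_y)) / \
        sum((x - mx) ** 2 for x in train_x)
    b = my - a * mx
    return lambda x: a * x + b

def mse(model, xs, ys):
    """Mean squared error of a model on held-out data."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

train_x, train_y = [1, 2, 3, 4], [2.0, 4.1, 5.9, 8.2]
test_x, test_y = [5, 6], [10.1, 11.8]

candidates = {
    "mean baseline": mean_model(train_y),
    "linear": linear_model(train_x, train_y),
}
best = min(candidates, key=lambda name: mse(candidates[name], test_x, test_y))
print(best)  # linear
```

The held-out scoring step is what lets the process pick a different winner as the underlying workforce data changes between scheduled re-training runs.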
We have barely scratched the surface of this capability, and our vertical integration with a clean, consistent data model allows these advanced tools to deliver the best outcomes to customers.

Labor Market Intelligence

One Model has the world’s best understanding of your internal HR data set; we do wonderful things with the data you already have - but we were missing the context of the external labor market and how it impacts our customers' workforces. As a result, we have developed a proprietary Labor Market Intelligence (LMI) tool. LMI is being released in January 2021 as a standalone product providing labor market analytics to our customers. LMI retains the functionality you love about our people analytics platform - the ability to flexibly navigate data, build your own storyboard content, and drill through to granular detail. Importantly, what LMI will allow One Model enterprise customers to do is link external market data to internal people data, delivering outcomes like identifying people paid below the market rate in their region, identifying employees in roles at risk of poaching due to high market demand and turnover, and helping you understand whether your talent is leaving for promotions or lateral moves.

Collaboration with the HR Tech Ecosystem

Finally, One Model understands the power of collaboration in the HR Tech ecosystem. We are already working with leading consultancies like Deloitte and are embedded with HCM vendors, helping them consume and make sense of their own data to deliver people analytics and extract value for their customers. At the end of the day, our vision is to understand the entire HR Tech ecosystem at the data layer, to help customers realize their investment in these systems, and to provide a data insurance policy as they transition between systems. Analytics is a by-product of this vision and thankfully it also pays the bills ;)
3 min read
Nicholas Garbis
As part of a recent People Analytics course from the Future Workplace, Nicholas Garbis joined forces with course leader Heather Whiteman, PhD to co-author an eBook on People Analytics called "Explore the Power of People Analytics: A Guide for Business and HR Leaders". While the book was specifically aimed at a general HR and business leader audience, we quickly found that a number of well-accomplished People Analytics leaders were getting value out of it as well. Whereas some HR and business leaders may be encountering this content for the first time, the more mature people analytics leaders are always searching for exactly this kind of introductory content to help them increase understanding and adoption of their team's work. We are here to accelerate your people analytics journey. As titled, the aim of the eBook is to "explore" the topic of People Analytics. In terms of a journey, this is a guidebook that highlights various "points of interest" that make the journey interesting and worth pursuing.

Download the eBook (.pdf) Explore the Power of People Analytics

We hope this eBook sparks ideas for how you can apply people analytics in your organization and makes it more accessible for your teams. We invite you to start building greater capability in this area so you can take advantage of the opportunities people analytics makes possible. Paperback edition is available on Amazon.com.

About One Model

One Model delivers a comprehensive people analytics platform to business and HR leaders that integrates data from any HR technology solution to deliver metrics, storyboard visuals, and advanced analytics through a proprietary AI and machine learning model builder. People data presents unique and complex challenges which the One Model platform simplifies to enable faster, better, evidence-based workforce decisions. Learn more at www.onemodel.co.
One Model’s new Labor Market Intel product delivers external supply & demand data at an unmatched level of granularity and flexibility. The views in LMI help you to answer the questions you and your leaders need answers to with the added flexibility to create your own customized views. Learn more at www.onemodel.co/LMI.
31 min read
Chris Butler
The first in a series of posts tackling the individual nuances we see with HR technology systems and the steps we take in overcoming their native challenges to deliver a comprehensive people analytics program. Download the White Paper on Delivering People Analytics from SAP SuccessFactors

Quick Links

- A long history with SuccessFactors
- Embedded Analytics won't cut it, you have to get the data out
- World leading API for extraction
- Time to extract data
- Full Initial Load
- Incremental Loads
- Modelling Data - Both SuccessFactors and External
- SF Data Modelling
- Analytics Ready Fact Tables
- Synthetic Events
- Core SuccessFactors Modules
- MDF Objects
- Snowflake Schema Inheritance
- Metrics - Calculations - Analytics
- Delivered Reporting and Analytics Content
- Creating and Sharing your own Analytics Content
- Using your own Analytical Tools
- Feed Data to External Vendors
- What About People Analytics Embedded?
- What About SAP Analytics Cloud?
- What About SuccessFactors Workforce Analytics?
- The One Model Solution for SAP SuccessFactors

A long history with SuccessFactors

I'm starting with SuccessFactors because we have a lot of history with SuccessFactors. SF acquired Infohrm, where many of our team worked, back in 2010, with the subsequent acquisition by SAP in 2012. I personally built and led a team in the Americas region delivering the workforce analytics and planning products to customers and ensuring their success. I left SAP in 2014 to found One Model. Many of One Model's team members were in my team or leading other global regions and, of course, we were lucky enough to bring on a complete world-leading product team from SAP after they made the product and engineering teams redundant in 2019 (perfect timing for us! Thanks SAP, they're doing a phenomenal job!). So let's dive in and explore SuccessFactors data for people analytics and reporting.

Embedded Analytics won't cut it, you have to get the data out.
It's no secret that all vendors in the core HR technology space espouse a fully integrated suite of applications, and that they all fall short to varying degrees. The SF product set has grown both organically and via acquisition, so you immediately have (even now) a disconnected architecture underneath that has been linked together where needed by software enhancements sitting above. Add in the MDF framework, with an almost unlimited ability to customize, and you quickly have a complexity monster that wasn't designed for delivering nuanced analytics. We describe the embedded reporting and analytics solutions as 'convenience analytics': they are good for basic numbers and operational list reporting but fall short of providing even basic analytics like trending over time. The new embedded people analytics from SF is an example where the data set and capability are very limited. To deliver reporting and analytics that go beyond simple lists and metrics (and to do anything resembling data science), you will need to get that data out of SF and into another solution.

World leading API for data extraction

One Model has built integrations to all the major HRIS systems, and without a doubt SuccessFactors has the best API architecture for getting data out to support an analytics program. Deep, granular data with effective-dated history is key to maintaining an analytics data store. The API still has its issues, of course, but it has been built with incremental updates in mind and, importantly, can cater for the MDF framework's huge customizability. The MDF inclusion is massive. It means that you can use the API to extract all custom objects, and that the API flexes dynamically to suit each customer. As part of our extraction, we simply interrogate the API for available objects and work through each one to extract the full data set. It's simply awesome.
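The "interrogate the API for available objects, then work through each one" loop can be sketched as follows. The discovery and fetch functions below are hypothetical stand-ins for real OData metadata and entity calls, not actual SuccessFactors endpoints; the point is that the extraction adapts to whatever objects (including MDF customizations) each tenant exposes.

```python
# Hedged sketch: discover the available objects from a metadata document,
# then extract every one of them. In practice discovery would parse the
# OData $metadata response; here it reads a simulated structure.

def discover_objects(metadata):
    """Return the names of all entity sets the tenant exposes."""
    return [entity["name"] for entity in metadata["entities"]]

def extract_all(metadata, fetch):
    """Pull the full record set for every discovered object."""
    tables = {}
    for name in discover_objects(metadata):
        tables[name] = fetch(name)  # would page through records per object
    return tables

# Simulated metadata and fetcher so the sketch runs end to end; a tenant
# with MDF customizations would surface extra cust_* objects automatically.
fake_metadata = {"entities": [{"name": "EmpJob"}, {"name": "cust_MyMdfObject"}]}
fake_fetch = lambda name: [{"object": name, "row": 1}]

tables = extract_all(fake_metadata, fake_fetch)
print(sorted(tables))  # ['EmpJob', 'cust_MyMdfObject']
```

Because the object list is read at run time rather than hard-coded, the same connector handles a tenant with 40 tables or 4,000.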
We recently plugged into a huge SuccessFactors customer of around 150,000 employees and pulled more than 4,000 tables out of the API into our warehouse. The initial full load took about a week, so it was obviously a huge data set, but incremental loads can then be used for ongoing updates. Some smaller organizations have run in a matter of minutes, but clearly the API can support small through to enormous organizations, something other vendors (cough, cough ... Workday) should aspire to. To give you a comparison on the level of effort we've spent on the One Model API connectors, approximately 600 hours have been spent on SuccessFactors versus more than 12,000 hours on our Workday connector. Keep in mind that we have more stringent criteria for our integrations than most organizations, including fault tolerance, maintenance period traversal, increased data granularity, etc., that go beyond what most individual organizations would have the ability to build on their own. The point is, the hours we've invested show the huge contrast between the SF and Workday architectures as it relates to data access. Time to Extract Data Obviously, the time needed to extract the data depends on the size of the organization, but I'll give you some examples of both small and huge below. Figure 1: Data extraction from SAP SuccessFactors using APIs Full Initial Loads In the first run we want everything that is available -- a complete historical dataset including the MDF framework. This is the most intense data pull and can vary from 20 minutes for a small organization of less than 1,000 employees to several days for a large organization above 100,000 employees. Luckily, this typically only needs to be done once during initial construction of the data warehouse, but there are times where you may need to run a replacement destructive load if there are major changes to the schema or the extraction, or if for some reason your synchronization gets out of alignment.
APIs can sometimes behave strangely, with random errors or missing records, whether due to the API itself or the transmission losing data, so keep this process handy and build it to be repeatable in case you need to run it again in the future. The One Model connectors provide an infrastructure to manage these issues. If we're only looking for a subset of the data or want to restrict the fields, modules, or subject areas extracted, we can tell the connector which data elements to target. Figure 2: Configuring the connector to SF in One Model platform Incremental Updates With the initial run complete, we can switch the extraction to incremental updates and schedule them on a regular basis. One approach we like to take when pulling incrementals is to take not just the changes since the last run but also a few extra time periods. For example, if you are running a daily update you might take the last two to three days' worth of data in case there were any previous transmission issues; this redundancy helps ensure accuracy. Typically we run our incremental updates on a daily basis, but if you want to run more often than this you should first consider: How long your incremental update takes to run. SF is pretty quick, but large orgs will see longer times, sometimes stretching into multiple hours. How long it takes your downstream processes to run and update any data. Whether there's a performance impact to updating data more regularly; typically, if you have a level of caching in your analytics architecture, this will be blown away with each update and have to start over again. The impact on users if data changes during the day. Yes, there can be resistance to data updating closer to real-time. Sometimes it's better to educate users that the data will be static and updated overnight. Whether or not the source objects support incremental updates. Not all can, and with SF there are a number of tables we need to pull in a full-load fashion, particularly in the recruiting modules.
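The overlapping-window idea for incrementals can be expressed as a small helper (a hedged sketch with hypothetical names, not the actual connector configuration): each run reaches back a couple of days before the last successful run, so a record dropped in an earlier transmission is picked up on a later pass.

```python
from datetime import datetime, timedelta

def incremental_window(last_run, overlap_days=2, now=None):
    """Return the (start, end) window for an incremental data pull.

    Reaching back `overlap_days` before the last successful run adds
    redundancy: if an earlier transmission silently dropped records,
    a subsequent overlapping pull will recover them.
    """
    end = now or datetime.utcnow()
    start = last_run - timedelta(days=overlap_days)
    return start, end
```

The overlap is safe as long as the load into the warehouse is an upsert (insert-or-update keyed on the record identifier), so re-pulled rows simply overwrite themselves.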
Modelling Data: Both SuccessFactors and External Okay, we have our SF data, and of course we have probably just as much data from other systems that we're going to need to integrate together. SF is not the easiest data set to model, as each module operates with its own nuances that, if you're not experienced with them, will send you into a trial-and-error cycle. We can actually see a lot of the challenges the SF data can cause by looking at the failures the SF team themselves have experienced in providing cross-module reporting over the years. There have been issues with duplicates, incorrect sub-domain schemas, and customer confusion as to where you should be sourcing data from. A good example is pulling from Employee Profile versus Employee Central. The SAP on-premise data architecture is beautiful in comparison (yes, really, and look out soon for a similar post detailing our approach to SAP on-premise). Modelling the SF Data At this point we're modelling (transforming) the raw source data from SF into analytics-ready data models that we materialize into the warehouse as a set of fact and dimension tables. We like to keep a reasonable level of normalization between the tables to aid in the integration of new, future data sources and for easier maintenance of the data set. Typically, we normalize by subject area and around the same time context. This can be difficult to build, so we've developed our own approaches to complete the time splicing and collapsing of records to condense the data set down to where changes occurred. The effort is worth it though, as the result is a full transactional history that allows the most flexibility when creating calculations and metrics, eliminating the need to go back and build a new version of a data set to support every new calculation (something I see regularly with enterprise BI teams). This is another example of where our team's decades of experience in modelling data for people analytics really comes to the fore.
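To illustrate the collapsing of records down to change points, here is a minimal sketch assuming hypothetical column names (`employee_id`, `effective_date`, and a set of tracked attributes); the real time-splicing logic is considerably more involved:

```python
def collapse_history(rows, keys):
    """Collapse consecutive effective-dated rows where `keys` are unchanged.

    `rows` is a list of dicts sorted by (employee_id, effective_date).
    Returns one row per change point, keeping the earliest effective_date
    of each unchanged run, so the data set condenses to where changes occurred.
    """
    collapsed = []
    for row in rows:
        prev = collapsed[-1] if collapsed else None
        if (prev and prev["employee_id"] == row["employee_id"]
                and all(prev[k] == row[k] for k in keys)):
            continue  # tracked attributes unchanged: absorb this row
        collapsed.append(dict(row))
    return collapsed
```

The same idea generalizes to splicing several effective-dated sources onto one timeline before collapsing.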
During the modelling process there are often a number of intermediate/transient tables required to merge data sets and accommodate modules that have different time contexts to each other, but at the end of the day we end up materializing them all into a single analytics-ready schema of tables (we call it our One schema). Some of what you would see is outlined below. Analytics Ready Fact Tables One.Employee - all employee effective-dated attributes One.Employee_Event - all employee events, equivalent to action/reason events (e.g. Hire, Termination, Transfer, Supervisor change, etc.). Usually you'll need to synthetically create some events where they don't exist as action/reason combinations. For example, many customers have promotions that aren't captured in the system as a transaction but are logically generated where a pay grade change occurs alongside a transfer, or any similar combination of logic. One.Requisitions - all requisitions and requisition events One.Applications - all application events One.Performance_Reviews - all performance review events ... the list goes on Dimension Tables One.dim_age - age breakout dimension with levelling One.dim_gender - gender breakout dimension, typically a single level One.organizational_unit - the multi-level organization structure ... we could go on forever; here's a sample of fields below Figure 3: Examples of tables and fields created in the One Model data schema Synthetic Events A core HRIS rarely captures all events that need to be reported on, either because the system wasn't configured to capture them or because the event classification is a mix of logic that doesn't fit into the system itself. These are perfect examples of why you need to get data out of the system to be able to handle unsupported or custom calculations and metrics. A frequently recurring example is promotions, where an action/reason code wasn't used or doesn't fit, and for reporting a logic test needs to be used (e.g. a change in pay grade + a numeric increase in salary).
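A promotion test of that kind might look like the following sketch (illustrative field names, not One Model's actual model logic): compare each employee transaction with the previous one and emit a synthetic 'Promotion' event where pay grade and salary both increased.

```python
def derive_promotions(transactions):
    """Emit synthetic 'Promotion' events from effective-dated transactions.

    `transactions` is a list of dicts sorted by (employee_id, effective_date).
    A promotion is inferred where pay_grade rose and salary increased, even
    though no action/reason code was recorded in the source system.
    """
    events, prev = [], {}
    for t in transactions:
        p = prev.get(t["employee_id"])
        if p and t["pay_grade"] > p["pay_grade"] and t["salary"] > p["salary"]:
            events.append({"employee_id": t["employee_id"],
                           "event": "Promotion",
                           "effective_date": t["effective_date"]})
        prev[t["employee_id"]] = t
    return events
```

Events derived this way land in the same event table as system-sourced events, so downstream metrics treat them identically.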
We would implement this test in the data model itself to create a synthetic event in our Employee_Events model. It would then be seen as a distinct event just like the system-sourced events. In this fashion you can overcome some of the native limitations of the source system and tailor your reporting and analytics to how the business actually functions. Core SuccessFactors Modules Employee Central - Aligns with our Employee and Employee Event tables and typically includes 100+ dimensions as they're built out. The dimension contents usually come from the foundation objects, picklist reference tables, an MDF object, or just the contents of the field if usable. This is the core of the analytics build, and virtually all other modules and data sets will tie back to the core for reference. Recruiting - Aligns with our Applications, Application_Event, and Candidates fact tables covering the primary reporting metrics and their associated dimensional tables. Succession - Aligns with Successor and associated dimensions. Performance - Performance Reviews (all form types) and associated dimensions. Learning - Learning Events, Courses, Participants. Goals - Goals, Goal_Events. MDF Objects MDF objects are generally built into the HRIS to handle additional custom data points that support various HR processes. Typically we'll see them incorporated into one of the main fact tables, aligning with the date context of the subject fact table (e.g. employee attributes in One.Employee). Where the data isn't relevant to an existing subject, or just doesn't align with the time context, it may be better to put the data into its own fact table. Usually the attribute or ID would be held in the fact table and we would create a dimension table to display the breakout of the data in the MDF object. For example, you might have an MDF object for capturing whether an employee works from home. Captured would be the person ID, date, and the value associated (e.g.
‘Works from Home’ or ‘Works from Office’). The attribute would be integrated into our Employee fact table with the effective date, and typically a dimension table would also be created to show the values, allowing the aggregate population to be broken out by these values in reporting and analysis. With the potential for a company to have thousands of MDF objects, this can massively increase the size, complexity, and maintenance of the build. It's best to be careful here, as the time context of different custom objects needs to be handled appropriately or you risk impacting other metrics as you calculate across domains. Inheritance of a Snowflake Schema Not to be confused with Snowflake the database, a snowflake schema creates linkages between tables that may take several steps to join to an outer fact or dimension table. An example is how we link a dimension like Application Source (i.e., where a person was hired from) to a core employee metric like Headcount or Termination Rate, which has been sourced from our core Employee and Employee Event tables. An example of this is below, where, to break out Termination Rate by Application Source and Age, we would need to connect the tables as shown: Figure 4: Example of connecting terminations to application source This style of data architecture allows for a massive scale of data to be interconnected in a fashion that enables easier maintenance and the ability to change pieces of the data model without impacting the rest of the data set. This is somewhat the opposite of what is typically created for consumption with solutions like Tableau, which operate easiest with de-normalized tables (i.e., giant tables mashed together) that come at the cost of maintenance and flexibility. Where one of our customers wants to use Tableau or a similar solution, we typically add a few de-normalized tables built from our snowflake architecture, which gives them the best of both worlds.
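The multi-step join behind Figure 4 can be approximated with plain Python dictionaries (hypothetical keys; in practice this would be SQL over the warehouse tables): a termination event reaches its Application Source only by hopping Employee Event -> Employee -> Application -> Application Source.

```python
def terminations_by_source(events, employees, applications, sources):
    """Break out termination events by Application Source via multi-step joins.

    Each lookup dict below plays the role of one join edge in the snowflake:
    events -> employees (employee_id) -> applications (application_id)
    -> sources (source_id).
    """
    emp = {e["employee_id"]: e for e in employees}
    app = {a["application_id"]: a for a in applications}
    src = {s["source_id"]: s["name"] for s in sources}
    counts = {}
    for ev in events:
        if ev["event"] != "Termination":
            continue
        application_id = emp[ev["employee_id"]]["application_id"]
        name = src[app[application_id]["source_id"]]
        counts[name] = counts.get(name, 0) + 1
    return counts
```

Keeping the tables separate like this means an Application Source change touches one table, not a giant de-normalized extract.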
Our calculation engine is built specifically to be able to handle these multi-step or matrix relationships, so you don't have to worry about how the connections are made once it's part of the One Model data model. Metrics - Calculations - Analytics When we get to this point, the hardest work is actually done. If you've made it this far, it is now relatively straightforward to build the metrics you need for reporting and analytics. Our data models are built to do this easily and on the fly, so there isn't a need for building pre-calculated tables like you might have to do in Tableau or other BI tools. The dynamic, on-the-fly nature of the One Model calculation engine means we can create new metrics or edit existing ones and immediately use them without having to generate or process any new calculation tables. Creating / Editing Metrics Figure 5: Example of creating and editing metrics in One Model Delivered Reporting and Analytics Content With an interconnected data model and a catalogue of pre-defined metrics, it is straightforward to create, share, and consume analytics content. We provide our customers with a broad range of pre-configured Storyboard content on top of their SuccessFactors data. A Storyboard library page allows a quick view of all subject areas and click-through to the deeper subject-specific Storyboards beneath. This content is comprehensive, covering the common subject areas for analytics and reporting such as workforce profile, talent acquisition, turnover, diversity, etc. There is also the ability to create dashboards for monitoring data quality, performing data validations, and viewing usage statistics to help manage the analytics platform.
Figure 6: Sample of standard Storyboard content in One Model Creating and Sharing your own Analytics Content Every one of our customers adds to the pre-configured content that we provide them, creating their own metrics and storyboards to tell their organization's people story, to support their HR, business leaders, and managers, and to save their people analytics team time by reducing ad-hoc requests for basic data. Our customers make the solution their own, which is the whole point of providing a flexible solution not tied to the limitations of the underlying source system. Content in One Model is typically shared with users by publishing a storyboard and selecting which roles will have access and whether they can edit or just view the storyboard itself. There are a number of other options for distributing data and content, including: Embedding One Model Storyboards within the SuccessFactors application itself; embedding One Model Storyboards within Sharepoint, Confluence, or any other website/intranet (e.g. the way we have used frames within this site: https://covidjobimpacts.greenwich.hr/#); and pushing data out to other data warehouses (what we call a "data destination") on a scheduled basis, something that works well for feeding other tools like Tableau, PowerBI, SAP Analytics Cloud, and data lakes. Figures 7, 8, 9: Storyboard sharing and embedding (example of an embedded storyboard: the COVID Job Impacts site, https://covidjobimpacts.greenwich.hr/#) Using your own Analytical Tools We want to ensure you never hit a ceiling on what you can achieve or limit the value you can extract from your data. If you wish to use your own tools to analyse or report on your data, we believe you should have the power to do so. We provide two distinct methods for doing this: Direct Connection to the One Model Data Warehouse.
We can authorize specific power users to access the data warehouse directly and read/write all the raw and modeled tables in the warehouse. If you want to use Tableau or PowerBI in this way, you are free to do so. You can write your own queries with SQL or extract directly from the warehouse in your data science programs such as Python or R. The choice is yours. At this point, it is essentially your warehouse as if you created it yourself; we have just helped to orchestrate the data. Data Destinations. If you need to feed data to an enterprise data warehouse, data lake, or other data store, then our data destinations functionality can send the selected data out on a scheduled basis. This is often used to integrate HR data into an enterprise data strategy or to power an investment in Tableau Server or similar, where teams want the HR data in these systems but don't want to build and run the complex set of APIs and data orchestration steps described above. In both of these scenarios, you're consuming data from the data model we've painstakingly built, reaping the productivity benefits by saving your technical team from having to do the data modelling. This also addresses a perennial issue for HR, where the IT data engineering teams are often too busy to devote time to understanding the HR systems sufficiently to deliver what is needed for analytics and reporting success. Feed Data to External Vendors Another use for the data destinations described above is to provide external vendors, or internal business teams, with the data they need to deliver their services. Many of our customers now push data out to these vendors rather than have IT or consultants build custom integrations for the purpose. We, of course, will have the complete data view, so you can provide more data than you did in the past when sourcing from the HRIS system alone.
A good example of this is providing employee listening/survey tools with a comprehensive data feed, allowing greater analysis of your survey results. Another use case we've facilitated is supporting the migration between systems, using our integrations and data models as the intermediate step to stage data for the new system while also supporting continuity of historical and new data. (Reference this other blog on the topic: https://www.onemodel.co/blog/using-people-analytics-to-support-system-migration-and-innovation-adoption) Scheduled Data Destinations Figure 10: Example of data destinations in One Model What About People Analytics Embedded? This solution from SF is great for what we call 'convenience analytics', where you can access simple numbers, low-complexity analytics, and operational list reports. These provide basic data aggregation and simple rates at a point in time, without any historical trending. In reality, this solution is transactional reporting with a fancier user interface. Critically, the solution falls down in the following areas: no trending across time (an analytics must-have); limited data coverage from SF modules (no access to data from some core areas, including learning and payroll); challenges joining data together and complexity for users in building queries; no ability to introduce and integrate external data sources; and no ability to create anything of true strategic value to your organization. What About SAP Analytics Cloud? SAC has shown some great promise in being able to directly access the data held in SF and start to link to some external source systems to create the data integrations you need for a solid people analytics practice. The reality, however, is that the capability of the product is still severely limited and doesn't provide enough capacity to restructure the data and create the right level of linkages and transformations required to be considered analytics-ready.
As it is today, the SAC application is little more than a basic visualization tool, and I can't fathom why an organization would take this path rather than something like Tableau or PowerBI, which are far more capable visualization products. SAP Analytics Cloud has not yet become the replacement for the Workforce Analytics (WFA) product as it was once positioned. The hardest part of delivering robust people analytics software has always been the ongoing maintenance and development of your organizational data. The SF WFA service model provided this with an expert team on call (if you have the budget) to work with you. With SAC, they have not even come close to the existing WFA offering, let alone something better. The content packages haven't arrived with any depth, and trying to build a comprehensive people analytics suite yourself in SAC is going to be a struggle, perhaps even more than building it on your own in a more generic platform. What About SuccessFactors Workforce Analytics? Obviously, our team spent a lot of time with the SuccessFactors WFA product, even predating the SF acquisition. The WFA product was a market and intellectual pioneer in the people analytics field back in the day, and many members of our team were there, helping hundreds of organizations on their earliest forays into people analytics. The WFA solution has aged, and SF has made little to no product improvements over the last five years. It is, however, still the recommended solution for SF customers that want trending and other analytics features that are relatively basic at this point. Several years ago, we started One Model because the SF WFA product wasn't able to keep pace with how organizations were maturing in their people analytics needs, and the tool was severely limiting their ability to work the way they needed to.
It was a black box where a services team (my team) had to deliver any changes and present that data through the limited lens the product could provide, all for a fee, of course. Organizations quickly outgrew and matured beyond these limitations, to the point that I felt compelled to tackle the problem in a different fashion. One Model has become the solution we always wanted: one that helps our customers succeed and grow and mature their people analytics capability with data from SAP SuccessFactors and other systems. We provide the integrations, the analytical content, the data science, the transparency, scalability, and configurability that our customers always wished we could provide with SF WFA. We built our business model to have no additional services cost, we keep all aspects of our data model open to the customer, and our speed and delivery experience mean there's no limit to which modules or data sets you wish to integrate. The One Model Solution for SAP SuccessFactors Direct API integration to SuccessFactors | Unlimited data sources | Daily data refresh frequency | Unlimited users | Purpose-built data models for SAP and SF | No additional services costs | People analytics metrics catalogue | Create your own metrics and analytics | Curated storyboard library for SuccessFactors | Operational reporting | Embed and share storyboards | HR's most advanced predictive modelling suite | Access all areas with a transparent architecture | Use your own tools, e.g. Tableau, PowerBI, SAC. Take a tour in the video below. We are happy to discuss your SuccessFactors needs.
7 min read
Nicholas Garbis
Our objective in this series is to offer our team's expertise to the public during this crisis in the same way that Walmart delivers trailers of water to natural disasters. The "water" we have to share is our expertise in workforce strategy, HR processes, data orchestration, and people analytics. This is a follow-up to our first COVID-related blog post, People Analytics for measuring the impact of Coronavirus (COVID-19), in which we laid out a set of critical "Level 1" questions that organizations should be able to answer during the onset of this pandemic. Later in this blog, you will find information regarding a very quick-and-dirty tool we have developed. Before we get there, however, let's step back for a moment to assess the broader topography of today's situation. We are in a massive global pandemic that has already demonstrated its exponential potential. Organizations will need a variety of people analytics to help them make data-driven decisions regarding their workforce and their overall operations. We have not experienced anything quite like this, so we should expect it will require great empathy and creativity, a willingness to lead, and the agility to test, fail, and learn. If we consider breaking this out into phases that organizations may pass through over the next several months, we can start to anticipate the shifting needs for people analytics. These phases will not come with clear markers, so the only way to know where you are is to step back at regular intervals to reevaluate the situation and reallocate your efforts. We have developed a quick tool for the current phase. The tool we assembled is focused on the first phase of the crisis -- where we are right now. It is built to answer some basic questions, acknowledging the unfortunate reality that our HR systems are unlikely to possess the information we need. As a result, a process will be needed to capture and consolidate the needed information.
In these situations, it's critical to seek out only the most critical data elements (i.e., "KISS"). A simple dataset, updated daily, has the potential to provide business and HR leaders with the "situational awareness" they need to make decisions quickly based on facts. The "COVID-19 Workforce Tracking" tool that we developed is a free Excel-based tool that can be used by organizations of any size, including those with limited people analytics resources. It aims to answer most of the questions laid out in our previous blog post. One example is a mid-size medical device firm without a people analytics team. They are already getting started with the tool, standing up a daily update process with ~4 representatives from various parts of the business making updates in a shared worksheet. One person then validates the dashboard before it goes to the CHRO for review with business leaders. It's worth restating that our objective here is to get something out to the maximum number of people in the shortest amount of time. Hence, this quick solution in Excel. Our team obviously has the ability to stand up these metrics and a series of new ones in our One Model platform, where we have robust data handling and visualization, but moving data to our clouds will require information security reviews by most organizations, which takes time. (We have opted to deliver the "water" now and come back with sandwiches real soon.) The first iteration of this tool (v1.0) can be accessed at the bottom of this blog. You will be asked for your email so we can communicate when any new versions are released. Here's a view of the dashboard: And here is a view of the dataset. It is a series of HRIS data fields (not all shown here) combined with a collection of 8 data fields (in yellow) which should be updated daily: Below are a few notes on the tool: Manage sensitive data according to the applicable laws and your company policies.
This is built in Excel so it can be used quickly by the widest range of users. The data collection template is intentionally simple to ensure it can be sustained as a daily process, ideally by a single point of contact. A slim set of ~8 questions, combined with some basic employee data, will enable you to answer nearly all of the critical questions. The basic employee data will be taken as a baseline/snapshot. Terminations will be tagged (not deleted) and hires will be added to the list. The dashboard will provide a set of key metrics with the ability to filter the results several ways. You and your teams can and should modify it to meet your needs. Future enhancements being considered (your input on this is welcome!): A macro to import the real-time statistics from the global COVID-19 tracking data as managed by Johns Hopkins University. Access to a cloud-based solution within One Model technology to enable benchmarking across companies, reduce version control issues, and enable advanced modeling and forecasting through our One AI machine learning/AI platform. Below is the link to the latest version. As updates are made, the latest version will always be posted within this blog. Current Version: 1.0 To download the tool, please click the link below. About One Model One Model provides our customers with a people analytics infrastructure that includes all the tools necessary to directly connect to source technologies and deliver the reporting, analysis, and ultimately prediction of the workforce and its behaviors. Use our leading out-of-the-box integrations, metrics, analytics, dashboards, and domain expert content, or create your own as needed, including the ability to use your own tooling like Tableau, Power BI, R, and Python. We provide a full platform for building, sustaining, and maturing a people analytics function, delivering more structured information, measurement, and accountability from your team. Learn more at onemodel.co.
5 min read
Tony Ashton
In the last One Model product update post I talked about our new user experience and hinted at some exciting new developments on the horizon. In this post I want to share some more information on those future designs. Thanks again to our customers for sharing their time and collaborating with us on our UX developments, and to everyone at One Model, but I want to make special mention of the powerhouse behind One Model's product designs, Nicole Li, a true UX unicorn! While the new user experience showcases Nicole's incredible work, the new stuff is where things get super exciting. This content is being shared to provide an insight into future product developments planned by One Model. It should not be interpreted as a commitment to deliver any particular functionality or to any defined timeline, and may be changed at any point by One Model at its discretion. Purchasing decisions should be made based on current product capability only. Having said this, we are super excited to share what we are working on and actively engage with you regarding product innovation and the future of people analytics. At the risk of overusing some classic cliches: startups run on pizza and business runs on PowerPoint™ (well, slides anyway). The slides phenomenon has been prevalent for the last couple of decades, and when I ask almost any company how they share information with managers, executives, boards, or in general meetings, the answer over 90% of the time is "slides". Storyboards & Slides When we recently announced One Model's new Storyboard capability we hinted at a broader vision, and here is part of that vision starting to unfold. The new Storyboards will have two modes: one where you have a fairly traditional tile-based layout, and the other where you are in presentation mode.
Online interactive use of One Model is growing rapidly, but much of the content from One Model still ends up in a presentation at some point, so we want to reduce the effort to create and maintain this content. Storyboards then act as both a modern interactive storytelling dashboard and an interactive slide-based presentation without the need for any rework. This will save a massive amount of time and also pre-positions the content for the most common destination, meeting consumers where they are. So, how does this work? The Storyboard view lets you arrange tiles on a continuously scrolling vertical space with control over layout, size, etc. You can then flip to Slides view to get an auto-arranged presentation with one tile per slide and controls to optimize the display for presenting to a group of people. Within the Slides view you can manage layout, decide which tiles to show or hide from the presentation, combine slides together, etc. To help you create a narrative for your presentation you can open the outline view and craft the flow of your story. Being able to present online from within the One Model platform is powerful and provides you the ability to interact with the data in real time to really engage with your audience. And, "yes", to the question you are starting to form in your mind... you will be able to export this to PowerPoint™ to blend with other presentations you are creating offline :) Telling a Story Using Narrative Insights Having assembled a compelling set of data isn't sufficient to drive action. You also need to engage your audience, and the best way to do this is through storytelling. The next major feature in our Storyboards vision is the ability to add a narrative to any tile that describes what is going on in straightforward business language.
Initially this capability will include information from One Model's metric library and your own narrative, but over time we will incorporate insights powered by the One AI machine learning platform. Captions can be arranged as elements within a tile, or as a separate, linked tile, with controllable positioning and layout depending on how you want to arrange your storyboard. The Storyboards vision is incredibly exciting, and the customers we have engaged in the design thinking behind these innovations can't wait to get their hands on this new capability. Neither can we! Stay tuned for more information as the roadmap unfolds. This article has been primarily concerned with the developing technology of Storyboards, but I also want to let you know that One Model has a vast library of content to help you tell the story of how people drive impact in your business. We'll write more on this soon, but reach out if you want to learn more about our metric catalogue and ever-growing library of topic storyboards. When combined with One AI, our machine learning platform, you can generate automated insights and future forecasts and identify key risks to answer the most pressing business questions you have today.
Featured
4 min read
Josh Lemoine
Software companies today aren't exactly selling the idea of "lovingly crafting you some software that's unique and meaningful to you". There's a lot more talk about best practices, consistency, and automation. It's cool for software capabilities to be generated by robots now. And that's great when it comes to things like making predictions; One Model is a leader in that space with One AI. This post isn't about machine learning, though. It's about modeling your company's people data. The people at One Model work with you to produce a people data model that best suits your company's needs. It's like having your own master brewer help guide you through the immense complexity that we see with people data. Why does One Model take this hands-on approach? Because the people employed at your company are unique, and your company itself is unique. Organizations differ not only in structure and culture but also in the combinations of systems they use to find and manage their employees. When you consider all of this together, it's a lot of uniqueness. The uniqueness of your company and its employees is also your competitive advantage. Why, then, would you want the exact same thing as other companies when it comes to your people analytics? The core goal of One Model is to deliver YOUR organization's "one model" for people data. A Data Engineer builds your data model in One Model. The Data Engineer working with you will have actual conversations with you about your business rules and logic, then translate that information into data transformation scripting. One Model doesn't take this hands-on approach because of technical limitations or an immature product. It's actually kind of the opposite. Affectionately known as "Pipeo", One Model's data modeling framework is a major factor in allowing One Model to scale while still using a hands-on approach. Advantages of Pipeo include the following: It's fast. Templates and the "One Model" standard models are used as the starting point.
This gets your data live in One Model very quickly, allowing validation and subsequent logic changes to begin early in the implementation process. It's extremely flexible. Anything you can write in SQL can be achieved in Pipeo. This allows One Model to deliver things outside the realm of a standard data model. We've created a data orchestration and integrated development environment with all the flexibility of a solution you might have built internally. It's transparent. You, the customer, can look at your Pipeo. You can even modify it if you're comfortable doing so. The logic does not reside in a black box. It facilitates accuracy. Press a validation button, get a list of errors. Correct, validate, and repeat. The scripting does not need to be run to highlight syntax issues. OMG, is it efficient. What used to take us six weeks at our previous companies and roles, we can now deliver in a matter of hours. Content templates help, but when you really need to push the boundaries, being able to do so quickly and with expertise on hand lets you do more, faster. It's fun to say Pipeo. You can even use it as a verb. Example: I pipeoed up a few new dimensions for you. The role the Data Engineer plays isn't a substitute for working with a dedicated Customer Success person from One Model; it's in addition to it. Customer Success plays a key role in the process as well. The Customer Success people at One Model bring many years of industry experience to the table, and they know their way around people data. They play a heavy role in providing guidance and thought leadership, as well as making sure everything you're looking for is delivered accurately. Customer Success will support you throughout your time with One Model, not just during implementation. If you'd like to sample some of the "craft people analytics" that One Model has on tap, please reach out for a demo. We'll pour you a pint right from the source, because canned just doesn't taste as good.
Featured
11 min read
Chris Butler
About ten years ago, as the pace of HR technology migration to the cloud started to heat up, I began having a lot more conversations with organizations that were struggling with the challenges of planning for system migration and deciding what to do with the data from their old systems post-migration. This became such a common conversation that it formed part of the reason One Model came into existence. Indeed, much of the initial thought noodling was around how to support the great cloud migration that was, and still is, underway. In fact, I don't think this migration is ever going to end as new innovation and technology becomes available in the HR space. The pace of adoption is increasing, and more money is being made than ever by the large systems implementation firms (Accenture, Deloitte, Cognizant, Rizing, etc.). Even what may be considered a small migration between two like systems can cost huge amounts of money and time to complete. One of the core challenges of people analytics has always been the breadth and complexity of the data set and how to manage and maintain this data over time. Do this well, though, and what you have is a complete view of the data across systems that is connected and evolving with your system landscape. Why, then, are we not thinking in a larger context about how this data infrastructure can support our organizations' adoption of innovation? After all, we have a perfect data store, toolset, and view of our data to facilitate migration. The perfect people analytics infrastructure has implemented an HR data strategy that disconnects the concept of data ownership from the transactional system of choice. This has been an evolving conversation, but my core view is that as organizations increase their analytical capability, they will put in place a data strategy that supports the ability to choose any transactional system to manage their operations.
Being able to move quickly between systems and manage legacy data alongside new data is key to adopting innovation, and the organizations that do this best will reap the benefits. Let's take a look at a real example, but note that I am ignoring the soft-skill components of how to tackle data structure mapping and the conversations required to identify business logic, etc., as these still need human input in a larger system migration. Using People Analytics for System Migration Recently we were able to deploy our people analytics infrastructure with a customer to specifically support the migration of data from Taleo Business Edition to Workday's Recruiting module. While this isn't our core focus as a people analytics company, we recently completed one of the last functional pieces we needed to accomplish this, so I was excited to see what we could do. Keep in mind that the steps and process below would be the same from your own infrastructure, but One Model has some additional data management features that grease the wheels. To support system migration we needed to be able to:
1. Extract from the source system (Taleo Business Edition), including irregular data (resume files).
2. Understand the source and model it to an intermediate common data model.
3. Validate all source data (metrics, quality, etc.).
4. Model the intermediate model to the destination target model.
5. Push to the destination (Workday).
6. Extract from the destination and validate the data as correct or otherwise.
7. Repeat the above automatically, as often as the project requires.
Business logic to transform and align data from source to target can be applied at both steps 2 and 4, depending on the requirements of the transformation. Below is the high-level view of the flow for this project. In more detail The Source There were 132 tables from Taleo Business Edition forming the source data set, extracted from the API, plus a separate collection of resume attachments retrieved via a Python program.
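The seven-step loop above can be sketched in miniature. This is an illustrative sketch only, with toy in-memory "systems" standing in for Taleo Business Edition and Workday; none of the function or field names here are One Model or vendor APIs.

```python
# Toy end-to-end migration cycle: extract -> common model -> validate ->
# target model -> load -> round-trip validation. All names are invented.

def extract(system):
    """Steps 1 and 6: pull all records out of a (toy) system."""
    return [dict(r) for r in system["records"]]

def to_common_model(rows):
    """Step 2: normalise source fields into an intermediate schema."""
    return [{"candidate": r["name"].title(), "stage": r["status"].lower()} for r in rows]

def validate(rows):
    """Step 3: return rows failing a data-quality scenario (here: blanks)."""
    return [r for r in rows if not r["candidate"] or not r["stage"]]

def to_target_model(rows):
    """Step 4: map the intermediate schema onto the target's field names."""
    return [{"Candidate_Name": r["candidate"], "Recruiting_Stage": r["stage"]} for r in rows]

def load(system, rows):
    """Step 5: push to the target system's load facility."""
    system["records"] = rows

source = {"records": [{"name": "ada lovelace", "status": "OFFER"}]}
target = {"records": []}

staged = to_common_model(extract(source))      # steps 1-2
assert not validate(staged)                    # step 3: no quality issues
load(target, to_target_model(staged))          # steps 4-5
round_trip = extract(target)                   # step 6: re-extract and compare
print(round_trip)
```

In a real project, a scheduler would re-run this cycle (step 7) each time data changed at the source.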
Luckily, we already understood this source and had modeled it. Model and Transform We already had models for Taleo, so the majority of the effort here was in catering for the business logic to go from one system to another and any customer-specific logic that needed to be built. This was our first time building towards a Workday target schema, so the bulk of the time was spent here, but this point-to-point model is now basically a template for re-use. The below shows some of the actual data model transformations taking place and the intermediate and output tables being created in the process. Validation and Data Quality Obviously, we need to review the data for completeness and quality. A few dashboards give us the views we need to do so. Analytics provides the ability to measure data and a window to drill through to validate that the numbers are accurate and as expected. If the system is still in use, filtering by time allows new data to be viewed or exported to provide incremental updates. Data quality is further addressed by looking for each of the data scenarios that need to be handled; these include items like missing values and consistency checks across fields. Evaluate, Adjust, Repeat It should be immediately apparent if there are problems with the data by viewing the dashboards and scenario lists. If data needs to be corrected at the source, you do so and run a new extraction. Logic or data fills can be catered for in the transformation/modeling layers, including bulk updates to fill any gaps or correct erroneous scenarios. As an automated process, you are not re-doing these tasks with every run - the manual effort is made once and infinitely repeated. Load to the Target System It's easy enough to take a table created here and download it as a file for loading into the target system, but ideally you want to automate this step and push to the system's load facilities.
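The data-quality scenarios described in this section (missing values, consistency checks across fields) are straightforward to express as code. A minimal sketch, with invented field names and rules:

```python
# Illustrative data-quality scenario checks over a list of row dicts.
# The field names and rules are invented for the example.

def quality_scenarios(rows):
    issues = []
    for i, r in enumerate(rows):
        if not r.get("hire_date"):
            issues.append((i, "missing hire_date"))       # missing-value scenario
        if r.get("term_date") and r.get("hire_date") and r["term_date"] < r["hire_date"]:
            issues.append((i, "terminated before hired"))  # cross-field consistency
    return issues

rows = [
    {"hire_date": "2023-01-09", "term_date": None},
    {"hire_date": None, "term_date": None},
    {"hire_date": "2023-05-01", "term_date": "2023-02-01"},
]
print(quality_scenarios(rows))  # [(1, 'missing hire_date'), (2, 'terminated before hired')]
```

Because checks like these are scripted rather than manual, they can be re-run automatically after every extraction, which is what makes the "evaluate, adjust, repeat" loop cheap.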
In this fashion you can automate the entire process and replace or add to the data set in your new system even while the legacy application is still functioning and building data. On the cutover day, you run a final process and you're done. Validate the Target System Data Of course, you need to validate that the new system is correctly loaded and functioning, so round-tripping the data back to the people analytics system gives you that oversight, and the same data quality checks can be run against the new system. From here you can merge your legacy and new data sets and provide a continuous timeline for your reporting and analytics across systems, as if they were always one and the same. Level of Effort We spent around 16-20 hours of technical time (excluding some soft-skills time) to run the entire process to completion, which included: building the required logic and source-to-target models for the first time; multiple changes to the destination requirements as the external implementation consultant changed their requirements; dozens of end-to-end runs as data changed at the source and the destination load was validated; and building a Python program to extract resume files from TBE, which is now a repeatable program in our augmentations library. That's not a lot of time, and we could now do the above much faster as the repeatable pieces are in place to move from Taleo Business Edition to Workday's Recruiting module. The same process can be followed for any system. The Outcome? "Colliers chose One Model as our data integration partner for the implementation of Workday Recruiting. They built out a tailored solution that would enable us to safely, securely and accurately transfer large files of complex data from our existing ATS to our new tool. They were highly flexible in their approach and very personable to deal with – accommodating a number of twists and turns in our project plan.
I wouldn't hesitate to engage them on future projects or to recommend them to other firms seeking a professional, yet friendly team of experts in data management." - Kerris Hougardy Adopting New Innovation We've used the same methods to power new vendors that customers have onboarded. In short order, a comprehensive cross-system data set can be built and automatically pushed to the vendor, enabling their service. Meanwhile, the data from your old system is still held in the people analytics framework, enabling you to merge the sets for historical reporting. If you can more easily adopt new technology and move between technologies, you mitigate the risks and costs of vendor lock-in. I like to think of this outcome as an insurance policy against bad-fit technology. If you know you can stand up a new technology quickly, use it while you need it, and later move to something that fits better without losing your data history, then you will be far more likely to test and adopt new innovation. Being able to choose the right technology at the right time is crucial for advancing our use of technology and, ideally, creating greater impact for our organizations and employees. Our Advice for Organizations Planning for an HR System Migration Get a handle on, and view across, your data first -- if you are already reporting and delivering analytics on these systems, you have a much better handle on the data and its quality than if you weren't. The data is often not as bad as you expect it to be, and cleaning it up with repeatable logic is much better than infrequently extracting and running manual cleansing routines. You could save a huge amount of time in the migration process and use more internal resources to do what you are paying an external implementation consultant to deliver. Focus more of your time on the differences between the systems and what you need to cater for to align the data to the new system.
A properly constructed people analytics infrastructure is a system-agnostic HR data strategy, able to deliver more than just insight into your people. We need to think about our people data differently and take ownership of it outside the transactional vendor. When we do so, we realize a level of value, flexibility, and ability to adopt innovation that will drive the next phase of people analytics results while supporting HR and the business in improving the employee experience. About One Model One Model delivers a comprehensive people analytics platform to business and HR leaders that integrates data from any HR technology solution with financial and operational data to deliver metrics, storyboard visuals, and advanced analytics through a proprietary AI and machine learning model builder. People data presents unique and complex challenges which the One Model platform simplifies, enabling faster, better, evidence-based workforce decisions. Learn more at www.onemodel.co.
Featured
4 min read
Stacia Damron
This summer, One Model opened a new data center in Sydney, Australia. It's been a busy period for One Model, especially for our growing Australia office. If you can scroll past this gorgeous teaser photo without getting sidetracked and planning a vacation, we'll provide some updates on what exactly the team has been up to. To begin with, the team has just opened new state-of-the-art, enterprise-grade infrastructure in its AWS-hosted data center in Sydney, Australia. The Australian infrastructure, which meets strict security standards, joins One Model's fabric of existing infrastructure in the United States and Europe, all of which is designed to provide a local, robust, secure, and high-performance environment for its customers' people and business data. This is our first data center in Australia. The data center opening comes shortly after we welcomed our newest Melbourne-based customer. Our newest customer, an Australian wagering business and one of the world's largest gaming companies, selected One Model for their people analytics platform in Q2 of 2019. Our team is thrilled to be a foundational element of their employee experience strategy, and we plan to provide a number of key benefits, including improved insight into their people, increased efficiency, and strategic value to key stakeholders. Our people analytics infrastructure's fast deployment will help this new customer shift away from a reliance on legacy ways of working and technologies. "With an Australian founding team and a sizeable part of the One Model Engineering and Product Management teams being based in Brisbane, the team's local knowledge and proximity represents a unique opportunity for customers in the Asia Pacific region. It allows One Model to be an active part of our global product innovation compared to traditional analytics software vendors," says Tony Ashton, Chief Product Officer for One Model.
These additional data centers play a crucial role in the company's ability to better serve its current and future Australian and Asia Pacific customers, as well as ensuring business continuity as the company continues to grow within the Australian market. Earlier this year, AWS received PROTECTED certification under IRAP assessment, ensuring security compliance for working with the Australian Government and large enterprises. "The opening of this new data center is in line with One Model's commitment to expand where our customers need us and to provide local infrastructure and personnel for data security and delivery of support services. An additional data center is already planned for delivery in Canada to support our Canadian customers in Q4 of 2019," says Chris Butler, One Model CEO. One Model looks forward to welcoming additional internationally-based companies into its family of customers as we continue to expand to serve these additional markets. In Australia? Want to meet the One Model team in person? Join us for the annual Australian HR Institute (AHRI) Convention in Brisbane this September 16-19, where we'll be exhibiting at stand #64. The exhibition hall is open to visitors free of charge. Let us know if you plan to stop by! About One Model One Model provides our customers with a people analytics infrastructure that includes all the tools necessary to directly connect to source technologies and deliver reporting, analysis, and ultimately prediction of the workforce and its behaviors. Use our leading out-of-the-box integrations, metrics, analytics, dashboards, and domain-expert content, or create your own as you need to, including the ability to use your own tooling such as Tableau, Power BI, R, and Python. We provide a full platform for building, sustaining, and maturing a people analytics function, delivering more structured information, measurement, and accountability from your team. Learn more at onemodel.co.
Featured
8 min read
Stacia Damron
It's a great time to be in management, right? According to a Harvard Business Review survey, we live in a world where trust is at an all-time low; 58 percent of respondents admitted to trusting strangers more than their own boss. Meanwhile, Uber is giving an average of 5.5 million rides a day. (The average Uber driver rating is 4.8/5 stars, by the way.) 5.5 million people are trusting a complete stranger to get them to the airport, but not their own managers. Workplace Trust Trust promotes confidence in the company's future. A high level of trust encourages employees to work more effectively, engage with their work and peers, and be more productive overall. One could say it's both a cause and an effect of a company's culture. Every day, we make decisions (consciously or unconsciously) based on the trust we have in each other. Each and every one of those decisions either encourages or discourages trust. So where did the workplace trust supposedly go? How should companies and managers work to build more trust? How are we, as people analytics professionals, working to measure, track, and improve workplace satisfaction altogether? This article doesn't unlock a magical answer, but here are some good KPIs to keep on your radar: Absenteeism Rate Employees who are present, on time, and hitting their goals and deadlines are going to be more engaged, satisfied employees. Those who aren't... might not be singing the company's praises. Monitoring absenteeism and cross-referencing it with other KPIs is a good place to start. Employee Turnover Rate According to Officevibe, only 12 percent of employees leave an organization for more money. On the other hand, 89 percent of bosses believe their employees quit because they want more money. Hmm. Is the company conducting exit surveys? Tracking why employees are leaving is vital, in addition to measuring metrics such as turnover under specific managers, in specific departments, or within specific minority groups.
Is there a pattern in turnover? Perhaps a specific department, manager, or trigger event is responsible? Do you have predictive models that can help you internalize your data and answer the big questions? Employee Net Promoter Score The famous Net Promoter Score, originally a customer service tool, was later turned inward on employees instead of customers. The Employee Net Promoter Score (eNPS) measures the likelihood that an employee would be willing to recommend your company as a great place to work (get this - according to research - 59% of employees wouldn't recommend theirs) and whether they would recommend the products or services your company creates. If you haven't yet started, track your eNPS. Then you can filter the data through a platform where you can see patterns and trends that could have affected the results. (Quick, shameless plug: you can measure, track, and monitor changes to these in One Model's people analytics platform to follow company-wide trust-related trends and to view correlations with other key data and metrics.) Training When your car runs out of gas, do you fill up the tank, or leave it on the side of the road? Unless you're from Dubai (and if you are, please send me the Maserati instead - we can work out the delivery instructions in the comments thread), then no, it's not normal for people to do that. Same with employees. Training a new employee can cost upwards of 20% of that employee's annual salary. It's better to engage your employees ahead of time than to constantly rehire new ones. Employees who actively choose to participate in optional company-sponsored training and education programs (and are allowed to pursue outside continuing education) have been shown to be more invested in both their role and the company, feel more valued, and maintain a high level of loyalty and trust in their workplace.
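For readers new to eNPS, it uses the standard Net Promoter arithmetic: the percentage of promoters (scores of 9-10 on a 0-10 "would you recommend us as a place to work?" question) minus the percentage of detractors (scores of 0-6). A minimal sketch with made-up survey responses:

```python
# Standard Net Promoter arithmetic applied to an employee survey.
# The survey responses below are invented for illustration.

def enps(scores):
    promoters = sum(1 for s in scores if s >= 9)   # 9-10: promoters
    detractors = sum(1 for s in scores if s <= 6)  # 0-6: detractors
    return round(100 * (promoters - detractors) / len(scores))

survey = [10, 9, 8, 7, 6, 3, 9, 10]  # the two passives (7-8) count for neither side
print(enps(survey))  # 4 promoters, 2 detractors out of 8 -> 25
```

The score ranges from -100 (all detractors) to +100 (all promoters), which is why tracking its trend over time matters more than any single reading.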
They have a higher likelihood of giving a high eNPS score, and they fuel company growth through positive word of mouth to their community (and network of prospective employees). The Summary For everyone out there who's not a rideshare driver, there's still hope. Yes, it takes extra time digging into the data, and yes, it requires a platform that can help you make sense of the KPIs you're tracking. But not all is lost. If you're digging into your workforce analytics data - have you considered building predictive models? They can shed light on things like the following: 1) Attrition Rates: Predict how many of your employees are going to leave within the next six or twelve months (based on 30+ factors such as manager turnover, whether or not they applied for jobs internally and were rejected, commute time, training attendance and participation, and so on). 2) Manager Toxicity Levels: Is there a lot of turnover under a particular department or manager? Is there high female turnover under a particular male executive? Shed light on what's going on. 3) Recruitment and Hiring: Are your recruitment strategies sound? Furthermore - are you hiring the right people for the job? Where are your best, high-performing sales representatives sourced from? Do you have data to back up your assumptions? One Model provides people analytics infrastructure - a platform for you to import your workforce data and build predictive models such as the ones listed above (and many more). Whether that means creating customized models or going with our out-of-the-box integrations - you get the whole shebang. We can take data from any source, clean and normalize it, and use it to create these models for you. Then we provide a means to view your data in these models with simple, clear visualization tools. (Example: think of all three of your last (or future) HRIS systems - all that data - cleaned and normalized from ALL of those systems - living in one place, in clear visuals.)
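To make the attrition idea in (1) concrete, here is a toy sketch of an attrition-risk score: a hand-rolled logistic model over a few of the factors listed above. The weights are invented for illustration only; a real model would be fit on historical exit data across many more factors.

```python
import math

# Toy attrition-risk score: logistic function over three invented factors
# (manager departed, rejected for an internal move, commute time in hours).
# Weights and bias are made up; a real model would learn them from data.
WEIGHTS = {"manager_left": 1.2, "internal_rejection": 0.9, "commute_hours": 0.6}
BIAS = -2.0

def attrition_risk(employee):
    z = BIAS + sum(WEIGHTS[k] * employee[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # probability-like score in (0, 1)

low_risk = {"manager_left": 0, "internal_rejection": 0, "commute_hours": 0.2}
high_risk = {"manager_left": 1, "internal_rejection": 1, "commute_hours": 1.5}
print(round(attrition_risk(low_risk), 2), round(attrition_risk(high_risk), 2))
```

Summing these individual scores across a population gives the "how many will leave in the next six or twelve months" headline number the article describes.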
Want to add data from more sources and see how it affects that model? No problem. The awesome thing is that once a model is built with your data in One Model - you don't have to rework everything and start from scratch if you want to add another source. It can be added right on in. Painless. Maybe I'm biased because of all the cool initiatives I see our team's data scientists and engineers working on, but I have to admit - I'd give One Model a five star rating. That's more than I can say for some of my Ubers. If you'd like to talk to a team member, check us out. We won't force you into a demo; ask us whatever questions you'd like. About One Model One Model's people analytics software provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as you need to. We provide a full platform for delivering more information, measurement, and accountability from your team.
Featured
3 min read
Chris Butler
Earlier this year I joined one of The Learning Forum's workforce analytics peer groups, and I wanted to share my experience attending and why I came away thinking these groups are a great idea that every people analytics practitioner should consider. There are a number of groups you can look at joining, including Insight222 and The Conference Board, but Brian Hackett from The Learning Forum had asked me earlier this year to come and present to their group about what we were doing at One Model. We had come up in their conversations and peer group emails, where members had been asking about different technologies to help them build their HR analytics capabilities. The Learning Forum is a group of mostly Fortune 2000 companies, with a sizable proportion being Fortune 500 organizations, so of course I accepted. Our presentation went well, and we had some great questions from the group about how we would tackle their existing challenges today and where the platform is heading for their future projects. A great session for us, but the real value I took away was in staying for the rest of the day to be a fly on the wall for how the group worked and what they shared with each other. Brian had tabled on the agenda some pre-scheduled discussions on what the attendees were interested in learning about and discussing with their peers. The agenda was attendee-curated, so all subjects were relevant to the audience, which gave the event structure and productivity. Following that was time for members to present on recent projects and work they had been conducting in their teams, along with any valuable insights, outcomes, and advice they could share with the group. This was awesome to sit in on: listening to how others in our space are working, what their challenges are, and how they fared, all in an environment of open, confidential sharing.
It's this spirit of confidentiality and sharing between peers that I felt most enabled the group to help and learn from each other in a way you just don't get from a run-of-the-mill conference. Practitioners were there to share, to learn, and to openly seek advice from their more experienced colleagues. Presentations ranged from experience using different vendors to cobbled-together projects built with spit, glue, and anything else hands could be laid on. I found the cobbled-together solutions to be the most innovative; even where a company of the practitioner's size has significant resources, the insights came from innovative thinking and making use of tools that every company has access to. It's these projects of working smart, not hard, that make me smile the most, and the best part is that they could be shared with a candor that couldn't have occurred at a conference or in a public LinkedIn post. Peer forums provide an educational opportunity that you won't get elsewhere; I highly recommend them for all people analytics practitioners. Thanks to Brian Hackett at The Learning Forum for letting me present and learn about how your members are learning from each other.
Featured
6 min read
Stacia Damron
It sounds ridiculous, but it's true. According to the New York Times, women held 4.2% of the CEO roles in America's 500 largest companies. Out of those same 500 companies, 4.5% of the CEOs were named David.* While shocking, it's unfortunately not all that surprising, especially when a whopping 41% of companies say they're "too busy" to deploy diversity initiatives. But for every company out there that's "too busy", there are plenty of others fighting to get it right. Take Google, for example. In 2016, Google's tech staff (specifically tech roles - not company-wide roles) was 1% Black, 2% Hispanic, and 17% women. They announced a plan to invest $150 million in workforce initiatives. The tech staff is now 2.5% Black, 3.5% Hispanic/Latinx, and 24.5% female, according to their 2018 diversity report. So what does that mean? It means that even the brightest and most innovative companies have their work cut out for them when it comes to improving diversity. Change doesn't happen overnight. Diversity breeds innovation; a diverse talent pool leads to diverse ideas. Get this: a Forbes article touts that transitioning a single-gender office to a team equally comprised of men and women would translate to 41% additional revenue. "Metrics" (which is just a fancy word for data, btw) don't lie. It's important to set, track, and monitor workforce diversity goals - especially when we have more tools than ever at our disposal to do so. Over the past few years here at One Model, we've seen a huge push toward prioritizing diversity metrics. In 2016, a Fortune 100 financial services organization, Company X (name anonymized), selected One Model's platform to measure and monitor company-wide trends in diversity data and metrics.
As their people analytics and workforce planning solution, One Model allowed them not only to report on their data better, but also to more easily track and monitor changes, determine key KPIs, and see how the improvements they were making internally affected the data. More Accurate Data = Better Reporting. During Company X's transition from SAP to Workday, they used One Model to retrieve and migrate survey data. The platform allowed them to combine and normalize the data from several sources, enabling the team to report off it as one source. The successful migration recovered the data and spared the team from having to redeploy the survey, allowing them to more accurately reflect their current diversity metrics and progress toward goals. This was a win. Here's the challenge: when pulled together, the data referenced above indicated that out of several thousand employee responses, a number of employees failed to select or identify with one of the given race selections. This represented a sizeable portion of the employees. One Model's software helped them quantify this number. Once they realized this, they saw an opportunity to set up other processes internally. They did just that - which helped identify 95% of the employees who fell within that group, obtaining vital missing data that raised the measured diversity within the organization. Determining Key KPIs and Measuring Improvements Furthermore, Company X used the One Model platform to identify and reward the departments that successfully hit their recruitment-based diversity goals. This allowed the team to survey these departments and identify the hiring trends and best practices that led to these improved diversity metrics.
By identifying the specific processes and KPIs surrounding these diversity metrics, departments that successfully met their goals could share recruiting tactics and best practices to ensure appropriate actions were taken to maximize diversity throughout the whole of the recruiting pipeline. Company X is currently implementing these processes and working towards replicating a similar outcome amongst other departments in need of workforce diversity improvement. Tracking and Monitoring Changes Last but not least, Company X wanted more visibility into why women had a lesser presence in managerial roles within the organization. While male-to-female promotions were roughly equal (of the 32 people promoted this past year, 16 were women), there were significantly more males than females in managerial roles. Upon reviewing the data, they learned that across the company’s requisitions, female applicants only made it to certain stages within the interview process (namely, an in-person interview) 50% of the time. Half the time, the only applicants that made it to that stage were male. They formed a hypothesis around a particular KPI: if more females made it to this particular stage, the odds were higher that more females would fill these roles. Company X set a goal of having a female candidate make it to the manager interview stage 80% of the time. They are testing different methods for how best to achieve this, and with One Model's help, they are able to measure the effectiveness of those methods. By providing this visibility, One Model’s platform is currently helping them monitor their progress towards this goal, and allows them to see the effect - the direct impact on the number of male and female managers - in real time. Company X is one of the many companies that has realized and embraced the importance of diversity in workforce planning. 
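The interview-stage KPI is simple to state precisely. Here is a minimal sketch in plain Python, using invented sample data (Company X's actual pipeline data and stage definitions are not shown here), of how you might compute the share of requisitions where at least one female candidate reached the in-person interview stage:

```python
# Hypothetical candidate records: (req_id, gender, reached_interview).
# The KPI is the share of requisitions where at least one female
# candidate reached the in-person interview stage (Company X's goal: 80%).
candidates = [
    (1, "F", True), (1, "M", True),
    (2, "M", True), (2, "F", False),
    (3, "F", True), (3, "M", False),
    (4, "M", True),
]

req_ids = {req for req, _, _ in candidates}
reqs_with_female_interview = {
    req for req, gender, reached in candidates
    if gender == "F" and reached
}

kpi = len(reqs_with_female_interview) / len(req_ids)
print(f"A female candidate reached the interview stage in {kpi:.0%} of requisitions")
```

With this sample data, two of the four requisitions had a female candidate reach the interview stage, matching the 50% pass-through rate described above.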
We’re confident they’ll eventually hit their goals, and we’re proud to be a part of the solution helping them do so. Is your company ramping up its People Analytics program or diving into workforce diversity initiatives? One Model can help you better view and report on the data associated with your diversity goals. Here are just a few of the top metrics companies are currently focusing on: Recruitment Metrics Representation Metrics, such as: Minorities / URMs Veterans Women IWDs Staffing/Placement Metrics Transaction Metrics Training Metrics, such as: Penetration of diversity-related training, general training participation rates, and demographics of talent pipeline Advancement Metrics External Diversity Metrics Culture / Workplace Climate Metrics *based on 2016 NYT data. Want to see what One Model can do for you? Schedule some time to chat with a One Model team member. About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as you need to. We provide a full platform for delivering more information, measurement, and accountability from your team.
Read Article
Featured
3 min read
Stacia Damron
Today, at The HR Technology Conference and Exposition, HRExaminer unveiled its 2019 Watchlist - "The Most Interesting AI Vendors in HR Technology." One Model is one of thirteen companies named, narrowed down from a list of over 200 intelligence tools, only 70 of which were invited to provide a demo. One Model was featured alongside several notable vendors, including Google, IBM, Workday, and Kronos. The Criteria HRExaminer, an independent analyst of HR technology and intelligence tools, selected two winners across five distinct categories: AI as a Platform, Data Workbench, Microservices, Embedded AI, and First Suite. One Model was named as one of two featured companies in HRExaminer's Data Workbench category and commended for its management of disparate data from disparate sources - specifically the platform's robust analytics integration. “Each of the companies on our 2019 Watchlist is demonstrating the best example of a unique value proposition. While we are in the early stages of the next wave of technology, they individually and collectively point the way," said John Sumser, HRExaminer’s founder and Principal Analyst. "Congratulations are in order for the work that they do. The award is simply a recognition of their excellence." Sumser goes on to state, “There are two main paths to analytics literacy and working processes in today’s market. The first is templated toolkits for specific purposes that can give employers a quick start and repeatable/benchmarkable processes. One Model represents the alternative: a complete set of tools for designing and building your own nuanced analytics, predictions and applications.” One Model is currently exhibiting at The HR Technology Conference and Exposition in Las Vegas, September 11th-13th. Attendees can visit booth #851 for more information. 
Read Article
Featured
4 min read
Stacia Damron
One Model is keen on ensuring our customers have an exceptional experience interacting with both our software and team alike. That experience begins the moment we meet. Often, the moment that relationship begins is on our website. One Model's platform helps HR and People Analytics teams simplify the messiest of their workforce data, strewn across multiple systems. Our software makes life easier - and our website needs to reflect that simplicity. It needs to be straightforward, easy to navigate, and provide helpful resources and tools to help you continue to grow your people analytics functions. For months, we have been diligently working to create a site that improves your experience - a place that provides you with tools and resources to support you in your data-wrangling journey. Well, now it's official - at the end of Q2, we launched it! The new site has clearly defined solutions for companies looking to scale their people analytics capabilities at all levels - regardless of company size - including resources to get started for evolving teams and strategies to leverage for more mature people analytics programs. Namely, our new website will more effectively serve those seeking more information regarding people analytics platforms and data warehousing solutions. One Model helps HR departments better support their people analytics teams. The new website contains more materials, including white papers, customer testimonials, videos, and datasheets. Our blog offers helpful tips, relevant articles, best practices, and useful insights for today's data-driven HR professionals and data scientists. The new website includes: Updated navigation, which better aligns customers with our offerings and core capabilities, reduces the number of user clicks to navigate the website, and directs users to relevant, meaningful content and solutions. A list of integrations and partnerships, which enables users to easily identify integrations that can add value to their current software or platforms. 
An updated blog, which enables users to quickly find applicable, informative content and industry news regarding workforce analytics, data warehouse management, data science techniques, and people analytics programs. More options to connect with the team via numerous information request forms, which offer more variation and allow users to submit requests for quotes, demos, or discussions. Supplementary materials to aid in decision making, providing more materials to view, including white papers, customer testimonials, videos, and datasheets. Career opportunities, showcasing open roles and allowing job-seekers to apply directly via that page. As our company continues to grow and expand within the US and UK markets, our new website will better represent One Model as we continue to set the bar for excellence in HR data warehouse management and people analytics team solutions. Visit onemodel.co for a comprehensive breakdown of our workforce data solutions. About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own. Our newest tool, One AI, integrates cutting-edge machine learning capabilities into the current platform, equipping HR professionals with readily-accessible, unparalleled insights from their people analytics data.
Read Article
Featured
8 min read
Phil Schrader
Last week I was doodling some recruiting graphs in my notebook, with an eye toward building out some new recruiting efficiency dashboards. I was thinking about how requisitions age over time and I got an idea for a cool stacked graph that counts up how many requisitions you have open each month and breaks them out into age buckets. Maybe some supporting breakouts like recruiter, some summary metrics, etc. Something like this: Phil's Beautifully Hand-illustrated Cholesterol Graph (above) This would be an awesome view. At a glance I could see whether my total req load was growing and I could see if I’m starting to get a build up of really old reqs clogging the system. This last part is why I was thinking of calling it the Requisition Cholesterol Graph. (That said, my teammate Josh says he hates that name. There is a comment option below… back me up here!) But then I got to thinking, how am I actually going to build that? What would the data look like? Think about it: Given: I have my list of requisitions and I know the open date and close date for each of them. Problem #1: I want to calculate the number of open reqs I have at the end of each time period. Time periods might be years, quarters, months, or days. So I need some logic to figure out if the req is open during each of those time periods. If you’re an Excel ninja then you might start thinking about making a ton of columns and using some conditional formulas. Or… maybe you figure you can create some sort of pancake stacks of rows by dragging a clever formula down the sheet… Also if you are an Excel ninja… High Five! Being an Excel ninja is cool! But this would be pretty insane to do in Excel. And it would be really manual. You’d probably wind up with a static report based on quarters or something and the first person you show it to will ask if they can group it by months instead. #%^#!!! 
If you’re a full-on Business Intelligence hotshot or a python / R wiz, then you might work out some tricky joins to inflate the data set to include a record for each period, or a script to count a value each time a req’s open date falls before or within a given period, etc. Doable. But then… Problem #2: Now you have your overall count of reqs open in each period. Alls you have to do now is group the requisitions by age and you’re… oh… shoot. The age grouping of the requisitions changes as time goes on! For example, let’s say you created a requisition on January 1, 2017. It’s still open. You should count the requisition in your open req count for January 2017 and you’d also count it in your open req count for June 2018 (because it’s still open). Figuring all that out was problem #1. But now you want to group your requisitions by age ranges. So back in January 2017, the req would count in your 0 - 3 months old grouping. Now it’s in your > 1 year grouping. The grouping changes dynamically over time. Ugh. This is another layer of logic to control for. Now you’re going to have a very wild Excel sheet or even more clever scripting logic. Or you’re just going to give up on the whole vision, calculate the average days open across all your reqs, and call it a day. $Time_Context is on my side (Gets a little technical) But I didn’t have to give up. It turns out that all this dynamic grouping stuff just gets handled in the One Model data structure and query logic -- thanks to a wonderful little parameter called $Time_Context (and no doubt a lot of elegant supporting programming by the engineering team). When I ran into $Time_Context while studying how we do Org Tenure I got pretty excited and ran over to Josh and yelled, “Is this what I think it is!?” (via Slack). He confirmed for me that yes, it was what I hoped it was. I already knew that the data model could handle Problem #1 using some conditional logic around effective and end dates. 
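To make the two problems concrete, here is a hand-rolled sketch in plain Python (with hypothetical requisition data) of the manual logic: tally the reqs open at each month end, and bucket each req by its age as of that month end - the dynamic grouping described above:

```python
from calendar import monthrange
from datetime import date

# Hypothetical requisition data: req id -> (open date, close date or None).
reqs = {
    "REQ-1": (date(2017, 1, 1), date(2017, 6, 30)),
    "REQ-2": (date(2017, 11, 15), None),
    "REQ-3": (date(2018, 3, 1), None),
}

def age_bucket(days_open):
    if days_open <= 90:
        return "0-3 months"
    if days_open <= 365:
        return "3-12 months"
    return "> 1 year"

def open_reqs_by_bucket(year, month):
    """Problems #1 and #2 in one pass: count reqs open at month end,
    grouped by their age *as of that month end* (the time context)."""
    period_end = date(year, month, monthrange(year, month)[1])
    counts = {}
    for opened, closed in reqs.values():
        if opened <= period_end and (closed is None or closed > period_end):
            bucket = age_bucket((period_end - opened).days)
            counts[bucket] = counts.get(bucket, 0) + 1
    return counts

print(open_reqs_by_bucket(2017, 1))   # REQ-1 is brand new here
print(open_reqs_by_bucket(2018, 6))   # REQ-2 has aged into a later bucket
```

Note how the same requisition lands in different buckets depending on which period you evaluate - which is exactly the logic that explodes in Excel.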
When you run a query across multiple time periods in One Model, the system can consider a date range and automatically tally up accurate end-of-period (or start-of-period) counts based on those date ranges. If you have a requisition that was opened in January 2017 and you want to calculate the number of reqs you have open at the end of every month, One Model will cycle through the end of each month, check to see if the req was opened before then and is not yet closed, and add it to the totals. We use this for all sorts of stuff, particularly headcount calculations using effective dates and end dates. So problem one was no problem, but I expected this. What I didn’t expect, and what made me Slack for joy, was how easily I could also deal with Problem #2. Turns out I could build a data model and stick $Time_Context in the join to my age dimension. Then One Model would just handle the rest for me. If you’ve gotten involved in the database side of analytics before, then you’re probably acquainted with terms like fact and dimension tables. If you haven’t, just think vlookups in Excel. So, rather than doing a typical join or vlookup, One Model allows you to insert a time context parameter into the join. This basically means, “Hey One Model, when you calculate which age bucket to put this req in, imagine yourself back in time in whatever time context you are adding up at that moment. If you’re doing the math for January 2017, then figure out how old the req was back then, not how old it is now. When you get to February 2017, do the same thing.” And thus, Problem #2 becomes no problem. As the query goes along counting up your metric by time period, it looks up the relevant requisition age grouping and pulls in the correct value as of that particular moment in time. So, with our example above, it goes along and says, “Ok I’m imagining that it’s January 2017. 
I’ll count this requisition as being open in this period of time and I’ll group it under the 0 - 3 month old range.” Later it gets to June 2018 and it says, “Ok… dang that req is STILL open. I’ll include it in the counts for this month again and let’s see… ok it’s now over a year old.” This, my friends, is what computers are for! We use this trick all the time, particularly for organization and position tenure calculations. TL;DR In short, One Model can make the graph that I was dreaming of -- no problem. It just handles all the time complexity for me. Here’s the result in all its majestic, stacked column glory: So now at a glance I can tell if my overall requisition load is increasing. And I can see down at the bottom that I’m starting to develop some gunky buildup of old requisitions (orange). If I wanted to, I could also adjust the colors to make the bottom tiers look an ugly gunky brown like in the posters in your doctor’s office. Hmmm… maybe Josh has a point about the name... And because One Model can handle queries like this on the fly, I can explore these results in more detail without having to rework the data. I can filter or break the data out to see which recruiters or departments have the worst recruiting cholesterol. I can drill in and see which particular reqs are stuck in the system. And, if you hung on for this whole read, then you are awesome too. Kick back and enjoy some Rolling Stones: https://www.youtube.com/watch?v=wbMWdIjArg0.
Read Article
Featured
6 min read
Chris Butler
A few weeks ago I gave a presentation at the Talent Strategy Institute’s Future of Work conference (now PAFOW) in San Francisco about how I see the long-term relationship between data and HR technology. Essentially, I talked through the thought process and the vision that I could no longer ignore - the one that compelled me to go start a company to chase it down. So here it is. My conviction is that we need to (and we will) look at the relationship between our data and our technology differently. Essentially, the two will be split. We will choose technology to manage our data and our workflows as we need it. We will replace that technology as often as our strategy and our business needs change. Those that know my team know that we have a long history of working with HR data. We started at Infohrm many years ago, which was ultimately acquired by SuccessFactors and shortly after by SAP. Professionally this was fantastic; worlds opened up and we were talking to many more organizations about the challenges they were facing across their technology landscape. How to achieve data portability. Over time I was thinking through the challenges our customers faced, a large one of which was how to help grease the wheels for the huge on-premise-to-cloud transition that was underway, and subsequently the individual system migrations we were witnessing across the HR landscape. The pace of innovation in HR was not slowing down. Over the years hundreds of new companies were appearing (and disappearing) in the HR Tech space. It was clear that innovation was everywhere and many companies would love to be able to adopt or at least try out this innovation but couldn’t. They were being hampered by political, budgetary, and other technology landscape changes that made any change a huge undertaking. System migration was on the rise. 
As companies adopted the larger technology suites, they realized that modules were not performing as they should and that there were still gaps in functionality they had to fill elsewhere. The promise of the suite was letting them down and continues to let them down to this day. This failure, combined with the pace of innovation, meant the landscape was under continuous flux. Fragmentation was stifling innovation and analytical maturity. The big reason to move to a suite was to eliminate fragmentation, but even within the suites the modules themselves were fragmented, and we as analytics practitioners, lacking a method for managing this change, only added to it. We could adopt new innovation but we couldn’t make full use of it across our landscape. Ultimately this slows down how fast we can adopt innovation and, downstream, how we improve our analytical maturity. All HR technology is temporary. The realization I started to come to is that all of the technology we were implementing and spending millions of dollars on was ultimately temporary. That we would continue to be in a cycle of change to facilitate our changing workflows and make use of new innovation to support our businesses. This is important, so let me state it again: all HR technology is temporary. We’re missing a true HR data strategy. The mistake we were making was thinking of our technologies and our workflows as our strategy for data management. This was the problem. If we as organizations could put in place a strategy and a framework that disconnected our data from our managing technology and planned for obsolescence, then we could achieve data portability. We need to understand the data at its fundamental concepts. If we know enough to understand the current technology and we know enough about the future technology, then we can create a pathway between the two. We can facilitate and grease the migration of systems. 
To do this effectively and at scale, you have to develop an intermediate context of the data. This becomes the thoroughfare. In essence this is a powerful concept, and it seems obvious, but it was too advanced for organizations to wrap their minds around, and finding customers for it was going to be near impossible. We would have to catch companies in the short window of evaluating a system change and convince them they needed to look at the problem differently. Analytics is a natural extension. With the intermediate thoroughfare and the context of each of these systems, you have a perfect structure for delivering analytics from the data and powering downstream use cases. We could deliver data to vendors that needed it to supply a service to the organization. We could return data from these services and integrate it into the data strategy. We could write this data back to those core source systems. We could extend the data beyond these systems with sources that an organization typically could not access and make use of on their own. Wrap all this up in the burgeoning advanced analytics and machine learning capabilities and you have a truly powerful platform. We regain choice in the technology we use. In this vision, data is effectively separate from our technology, and we regain the initiative from our vendors in how we choose to manage our data. An insurance policy for technology. With the freedom to move and to adopt new innovation, we effectively buy ourselves an insurance policy in how we purchase and make use of products. We can test; we can prove; we can make the most of the best-of-breed innovation that has been growing in our space. If we don’t like it, we can turn it off or migrate - without losing any data history and while minimizing switching costs. This is a long term view of how our relationship to data and our vendors will change. It is going to take time for this view to become mainstream, but it will. 
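A toy sketch of the "intermediate context" idea (all system and field names here are invented for illustration): records from two hypothetical source systems get mapped into one common model, so either system can be swapped out without losing the data's meaning.

```python
# Toy "intermediate context": map two hypothetical source-system record
# shapes (field names invented for this sketch) into one common model.
COMMON_FIELDS = {"employee_id", "hire_date", "job_title"}

def from_system_a(rec):
    # e.g. a legacy on-premise HRIS
    return {"employee_id": rec["emp_no"],
            "hire_date": rec["start_dt"],
            "job_title": rec["position"]}

def from_system_b(rec):
    # e.g. a newer cloud HCM suite
    return {"employee_id": rec["worker_id"],
            "hire_date": rec["original_hire_date"],
            "job_title": rec["business_title"]}

records = [
    from_system_a({"emp_no": "100", "start_dt": "2015-03-01", "position": "Analyst"}),
    from_system_b({"worker_id": "200", "original_hire_date": "2017-06-12", "business_title": "Recruiter"}),
]

# Every record now has the same shape, whichever system it came from.
assert all(set(r) == COMMON_FIELDS for r in records)
print(records)
```

Migrating from system A to system B then becomes a matter of adding one mapping into the thoroughfare, rather than reworking every report and downstream use case that consumes the data.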
The efficiencies and pace that it provides to change the direction of our operations will deliver huge gains in how we work with our people and our supporting vendors. There are still challenges to making this happen. Vendors young and old need to provide open access to your data (after all, it’s your data). The situation is improving, but there are still some laggards. The innovative customers at One Model bought us for our data and analytical capabilities today, but they know and recognize that we’re building them a platform for their future. We’ve been working with system integrators and HR transformation groups to deliver on the above promise. The pieces are here, they’re being deployed; now we need to make the most of them.
Read Article
Featured
9 min read
Phil Schrader
We’re back with another installment of our One Model Difference series. On the heels of our One AI announcement, how could we not take this opportunity to highlight it as a One Model difference maker? In preparation for the One AI launch, I caught up with Taylor from our data science team and got an updated tour of how it all works. I’m going to try to do that justice here. The best analogy I can think of is that this thing is like a steam engine for data science. It takes many tedious, manual steps and lets the machine do the work instead. It's not wizardry. It's not a black box system where you have to point at the results, shrug, and say, “It’s magic.” This transparent approach is a difference in its own right, and I’ll cover that in a future installment. For now though, describing it as some form of data wizardry simply would not do it justice. I think it’s more exciting to see it as a giant, ambitious piece of industrial data machinery. Let me explain. You know the story of John Henry, right? John Henry is an African-American folk hero who, according to legend, challenged a steam-powered hammer in a race to drill holes to make a railroad tunnel. It’s a romantic, heart-breaking story. Literally. It ends with John Henry’s heart exploding from the effort of trying to keep pace. If you need a quick refresher, Bruce Springsteen can fill you in here. (Pause while you use this excuse to listen to an amazing Bruce Springsteen song at work.) Data science is quite a bit easier than swinging a 30-pound hammer all day, but I think the comparison is worthwhile. Quite simply, you will not be able to keep pace with One AI. Your heart won’t explode, but you’ll be buried under an exponentially growing number of possibilities to try out. This is particularly true with people data. 
The best answer is hiding somewhere in a giant space defined by the data you feed into the model, multiplied by the number of techniques you might try out, multiplied by (this is the sneaky one) the number of different ways you might prepare your data. Oh, and that’s just to predict one target. There are lots of targets you might want to predict in HR! So you wind up with something like tedious work to the fourth power, and you simply should not do it all by hand. All data science is tedious. The first factor, deciding what data to feed in, is something we’re all familiar with from stats class. Maybe you’ve been assigned a regression problem and you need to figure out which factors to include. You know that a smaller number of factors will probably lead to a more robust model, and you need to tinker with them to get the ones that give you the most bang for your buck. This is a pretty well known problem, and most statistical software will help you with this. This phase might be a little extra tricky to manage over time in your people analytics program, because you’ll likely bring in new data sets and have to retest the new combinations of factors. Still, this is doable. Hammer away. Of course, One AI will also cycle through all your dimensional data for you. Automatically. And if you add factors to the data set, it will consider those factors too. But what if you didn’t already know what technique to use? Maybe you are trying to predict which employees will leave the company. This is a classification problem. Data science is a rapidly evolving field. There are LOTS of ways to try to classify things. Maybe you decide to try a random forest. Maybe you decide to try neural nets using Tensorflow. Now you’re going to start to lose ground fast. For each technique you want to try out, you’ve got to cycle through all the different data you might select for that model and evaluate the performance. And you might start cycling through different time frames. 
Does this model predict attrition using one year of data but become less accurate with two years…? And so on. Meanwhile, One AI will automatically test different types of models and techniques, over different time periods, while trying out different combinations of variables and evaluating the outcomes. In comparison, you’ll start to fall behind pretty rapidly. But there’s more... Now things get kind of meta. HR data can be really problematic for data science. There is a bunch of manual work you need to do to prepare any data set to yield results. This is the standard stuff like weeding out bad columns, weeding out biased predictors, and trying to reduce the dimensionality of your variables. But this is HR DATA. The data sets are tiny and lopsided even after you clean them up. So you might have to start tinkering with them to get them into a form that will work well with techniques like random forests, neural nets, etc. If you’re savvy, you might try doing some adaptive synthetic sampling (making small, lopsided samples look larger and more balanced) or principal component analysis. (I’m not savvy, I’m just typing what Taylor said.) So now you’re cycling through different ways of preparing the data, to feed into different types of models, to test out different combinations of predictors. You’ve got tedious work to the third power now. Meanwhile, One AI systematically hunts through these possibilities as well. Synthetic sampling was a dead end? No problem. On to the next technique and on through all the combinations to test that follow. This is not brute force per se -- that actually would introduce new problems around overfitting. The model generation and testing can actually be organized to explore problem spaces in an intelligent way. But from a human vs. machine perspective, yeah, this thing has more horsepower than you do. And it will keep working the models over, month after month. This is steam-powered data science. Not magic. Just mechanical beauty. 
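As a rough illustration of what that kind of mechanical search looks like - scikit-learn and a synthetic data set are stand-ins here, not One AI's actual implementation - a loop can score several candidate classifiers on the same data and keep the best by cross-validated score:

```python
# Stand-in for the search loop: try several classifiers on the same
# (synthetic, imbalanced) data and keep the best cross-validated score.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Imbalanced toy "attrition" data: roughly 90% stayers, 10% leavers.
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)

candidates = {
    "logistic": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# Score every candidate the same way; a fuller version would also loop
# over preprocessing choices, resampling strategies, and time windows.
scores = {name: cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

Multiply this inner loop by every data-preparation variant and every target you might predict, and the "tedious work to the fourth power" described above becomes obvious - which is exactly why it should be automated.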
And now that we have this machine for HR machine learning, we can point that three-phase cycle at different outcomes that we want to predict. Want to predict terminations? Of course you do. That’s what everyone wants to predict. But what if in the future you want to predict quality of hire based upon a set of pre-hire characteristics? One AI will hunt through different ways to stage that data, through different predictive techniques for each of those potential data sets, and through different combinations of predictors to feed into each of those models… and so on and so on. You can’t replicate this with human-powered data science alone. And you shouldn’t want to. There’s no reason to try to prove a John Henry point here. Rather than tediously cycling through models, your data science team can think about new data to feed into the machine, can help interpret the results and how they might be applied, or can devise their own wild, one-off models to try, because they won’t have to worry about exhaustively searching through every other option. This might turn out similar to human-computer partnership in chess. (https://www.bloomreach.com/en/blog/2014/12/centaur-chess-brings-best-humans-machines.html) One AI certainly supports this blended, cooperative approach. Each part of the prediction pipeline can be separated and used on its own. Depending on where you are at in your own data science program, you might take advantage of different One AI components. If you just want your data cleaned, we can give you that. Or, if you already have the data set up the way you want it, we can save you time by running a set of state-of-the-art classifiers on it, etc. The goal is to have the cleaning/preprocessing/upsampling/training/etc. pieces all broken out so you can use them individually or in concert. In this way, One AI can deliver value whatever the size and complexity of your data science team, as opposed to an all-or-nothing scenario. In that regard, our human vs. 
machine comparison starts to break down. One AI is here to work with you. Imagine what John Henry could have done if they’d just given him the keys to the steam engine. Book some time on Phil's calendar below to get your HR data-related questions answered. About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own. Our newest tool, One AI, integrates cutting-edge machine learning capabilities into its current platform, equipping HR professionals with readily-accessible, unparalleled insights from their people analytics data. Notable customers include Squarespace, PureStorage, HomeAway, and Sleep Number.
Read Article
Featured
5 min read
Stacia Damron
How did Spring cleaning become a thing, and why do we do it? It’s officially March. Daylight savings has us up an hour earlier, the weather’s teasing us by thinking about getting warmer, and most of us are envious of the students enjoying spring break on a beach somewhere. Supposedly, this odd combination of things gets us in the mood to clean house. But there’s research to back it up: according to the experts, the warm weather and extra light are responsible for giving us the additional boost of energy. What is it about cleaning that gets us so excited? Is it the fresh smell of mopped floors? Is it the sigh of relief when you can actually park your car in the garage instead of using it for storage? Or is it the look of shock on your significant other’s face when they realize their 10-year-old socks (the ones with the huge holes) are gone for good? It's kind of weird. Now, before we get too far in - I hope you didn’t get really excited about reading some “spot-free window cleaning tips” or “how to declutter your closet in 12 easy steps.” After all, 1) this is a software blog, and 2) I haven’t mastered either of those things. Spring cleaning is a way to refresh and reset. It feels GOOD to declutter. This is the premise here. Most people associate Spring cleaning with their home - but what if we went into Spring with that same excitement at work as well? What if we wanted to share that same, cathartic feeling with our teams and coworkers? You can! One Model can help you Spring clean your people analytics data and provide your team with access to more insights within your current workforce analytics data. We’re the experts at pulling data from as many as 40 or so sources. We can place it on a single platform (that will automatically refresh and update), allowing your team to see how it all interacts together - in one place. 
Say goodbye to the days of exporting data and poking around with VLOOKUPs in Excel, only to have to manually create the same report over and over again. Using the One Model platform to manage your HR data is akin to having someone come in and untangle 200 feet of Christmas lights (but instead of lights, it’s untangling data from your workforce analytics systems). And when you use our platform, you won't have to untangle it again. How awesome is that? A work-related spring cleaning is even more satisfying than a spring cleaning at home. Honestly, it is. You’re not going to get a promotion from organizing your cookware cabinet. However, at work, you might be considered for one if you detangle your data and save your team hours of their valuable time and resources on preparing data for analysis. So, if you suddenly get the itch to clean something - I urge you and your HR team to commit to participating in a workforce data spring cleaning. Call it a day, and contact One Model to sort out your data organization problem for you. Same satisfaction, less scrubbing - I promise. Then, go home and turn your Roomba on, knowing you just conquered spring cleaning on both frontiers. Book a demo. Or just book some time to get your HR data-related questions answered.
Read Article
Featured
6 min read
Phil Schrader
There will be over 400 HR product and service providers in the expo hall at HR Tech in September. A typical company makes use of 8 - 11 of these tools, some as many as 30. And that is wonderful. I love working in HR Technology. Companies are increasingly free to mix and match different solutions to deliver the employee experience that is right for them. New products come to market all the time. And the entrepreneurs behind these products are pretty consistently driven by a desire to make work better for employees. Better for employees that don't work in HR Operations and People Analytics, that is. Because all that innovation leads to data fragmentation. In your organization, you might recruit candidates using SmartRecruiters in some countries and iCIMS in others. You might do candidate assessments in Criteria Corp and Weirdly. Those candidates might get hired into Workday, have their performance reviews in Reflektive and share their own feedback through Glint surveys. This would not be in the least bit surprising. And it also wouldn't be surprising if your internal systems landscape changed significantly within the next 12 months. The pace of innovation in this space is not slowing down. And the all-in-one suite vendors can’t keep pace with 400 best-of-breed tools. So if you want to adopt new technology and benefit from all this innovation, you will have to deal with data fragmentation. How do you adopt new innovation without losing your history? What if the new technology isn’t a fit? Can you try something else without having a gaping hole in your analytics and reporting? How will you align your data to figure out if the system is even working? This is where One Model fits into the mix. We're going to call this One Model Difference your Data Insurance Policy. One Model pulls together all the data from your HR systems and related tools, then organizes and connects this data as if it all came from a single source. 
This means you can transition between technology products without losing your data. This empowers you to choose which technology fits your business without suffering a data or transition penalty. I remember chatting about this with Chris back at HR Tech last year. At the time I was working at SmartRecruiters and I remember thinking... Here we are, all these vendors making our pitches and talking about all the great results you're going to get if you go with our product. And here's Chris literally standing in the middle of it all with One Model. And if you sign up with One Model, you'll be able to validate all these results for yourself because you can look across systems. For example, you could look at your time to hire for the last 5 years and see if it changed after you implemented a new ATS. If you switched out your HRIS, you could still look backwards in time from new system to old and get a single view of your HR performance. You could line up results from different survey vendors. You'd literally have "one model," and your choice of technology on top of that would be optional. That's a powerful thought. A few months later, here I am getting settled in at One Model. I'm getting behind the scenes, seeing how all this really comes together. And yeah, it looks just as good from the inside as it did from the outside. I've known Chris for a while, so it's not like I was worried he was BS-ing me. But, given all the new vendors competing for your attention, you'd be nuts if you haven't become a little skeptical about claims like data-insurance-policy-that-makes-it-so-you-can-transition-between-products-without-losing-your-data. So here are a couple of practical reasons to believe, beyond the whole cleaning up and aligning your data stuff we covered previously. First off, One Model is... are you ready... single tenant. Your data lives in its own separate database from everyone else's data. It's your data. 
If you want to have direct database access into the data warehouse that we've built for you, you can have it. Heck, if you want to host One Model in your own instance of AWS, you can do that. We're not taking your data and sticking it into some rigid multi-tenant setup at arms length from you. That would not be data insurance. That would be data hostage-taking. Second, One Model doesn't charge per data source. That would be like one of those insurance policies where everything is out-of-network. With One Model, your systems are in-network. If you add a new system and you want the data in One Model, we'll add the data to One Model. If we don't have a connector, we'll build one. One of our clients has data from 40 systems in One Model. 40 systems. In one single model. In its own database. With no fees per data source. So go wild at HR Tech this fall. It is in Vegas after all. Add all the solutions that are right for your employees. And tell all your new vendors you'll be able to hold them accountable for all those bold ROI-supporting metrics they’re claiming. Because you can put all your data into One Model for all your people analytics. You can see for yourself. And if you swap that vendor out later, you’ll take all your data with you. Just don't wait until then to reach out to us at One Model. We love talking shop. And if you happen to like what you see with One Model, we can have your data loaded well before you get to Vegas. About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as you need to. We provide a full platform for delivering more information, measurement, and accountability from your team.
Read Article
Featured
3 min read
Stacia Damron
The One Model team is pleased to announce its official launch of One AI. The new tool integrates cutting-edge machine learning capabilities into the current One Model platform, equipping HR professionals with readily-accessible, unparalleled insights from their people analytics data. One Model’s core platform enables its customers to import multiple data sources into one extensible, cloud-based platform. Organizations are then able to take full control of their people and business data, gaining increased visibility and spotting trends in the data that would otherwise remain unnoticed.

Machine Learning Insights like HR Professionals Have Never Seen Before

One AI delivers a suite of out-of-the-box predictive models and data extensions, allowing organizations to understand and predict employee behavior like never before. One AI extends upon the current One Model platform capabilities, so now HR professionals can access machine learning insights alongside their current people analytics data and dashboards. Additionally, the solution is open to allow customers and their partners to create and run their own predictive models or code within the One Model platform, enabling true support for an internal data science function. “One AI is a huge leap into the future of workforce analytics,” says Chris Butler, CEO of One Model. “By applying One Model's full understanding of HR data, our machine learning algorithms can learn from all of a customer’s data and predict on any target that our customers select.” The new tool offers faster insights: it can create a turnover risk predictive model in minutes, consuming data from across the organization, cleaned, structured, and tested through dozens of ML models and thousands of hyperparameters. It utilizes these to create a unique, accurate model that can provide explanations and identify levers for reducing an individual employee's risk of turnover. This ability to explain and identify change levers is a cutting-edge capability. 
It allows One AI to choose a high-accuracy model that would otherwise be unintelligible and explain its choices to our users. “The launch of One AI will have a huge impact on current and future customers alike,” says Stacia Damron, One Model’s Senior Marketing Manager. “One AI’s ability to successfully incorporate machine learning insights into an organization’s people analytics strategy is significant. It means it’s possible to quickly and automatically produce models that can analyze bigger, more complex data and deliver faster, more accurate results. By creating more precise models, and augmenting internal capabilities, an organization can better identify cost-saving opportunities and mitigate risk.” The One Model team looks forward to sharing more information about One AI with this year’s People Analytics World Conference attendees in London on April 11-12. Stop by the One Model booth if you would like to connect and learn more.
Read Article
Featured
7 min read
Phil Schrader
People often ask us, "What makes One Model different?" Well...there's a lot we could show and tell. We've decided to respond with a series of blog posts covering each and every reason. Read on for more! You can't do this with Tableau. One Model features the most advanced role-based security system of any analytics application. It has to. People data is often the most complex and most sensitive data in an organization. Through 15 years of experience working with People Analytics teams and knowing how they wish to provide access, we built a security methodology that caters for all scenarios and fills the complex gaps that other vendors ignore. One Model allows administrators to define custom security groups and designate fine-grained application permissions to each: Can these users create dashboards or just view them? Can they even filter dashboards? Can they drill down to detail? Can they use the Explore tool? Can they build their own metrics? Can they access a metric in one population but not access it in another without changing roles? At the data layer, One Model group permissions control access at both the column level (which data elements a user can see) and the row level (which records of the data element a user can access): Can they see a given department’s data? Do they have access to compensation metrics? Can they cut those metrics by Grade or Position? If they drill into that data, can they see person-level detail? Better still, One Model security roles understand each user’s relationship to the data. Who reports to whom, for example. That means you could grant all the leaders in your organization drill-down access to their own team members with a single contextual data rule that follows them through the organization as their data changes. Done. Zero maintenance required. Multiple roles merge together to provide the exact level of access for each user, regardless of whether they're an HRBP, executive, or director with complex reporting lines. 
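The contextual-rule idea above can be illustrated with a toy sketch. This is a hedged illustration of the general technique, not One Model's implementation; the org chart, function names, and employee IDs are all invented. The point it demonstrates: one rule ("a leader may see anyone in their own reporting line") is resolved against the org data at query time, so no per-leader access lists need maintaining as the org changes.

```python
# Toy sketch of a contextual row-level security rule (NOT One Model's code).
# The rule is resolved against the reporting hierarchy at query time.

ORG = {  # hypothetical data: employee_id -> manager_id
    "ceo": None,
    "vp_sales": "ceo",
    "rep_1": "vp_sales",
    "rep_2": "vp_sales",
    "vp_eng": "ceo",
    "dev_1": "vp_eng",
}

def reports_to(employee, leader, org=ORG):
    """True if `employee` sits anywhere under `leader` in the org chart."""
    manager = org.get(employee)
    while manager is not None:
        if manager == leader:
            return True
        manager = org.get(manager)  # walk up the reporting line
    return False

def visible_rows(viewer, rows):
    """Row-level filter: a viewer sees their own record plus their reporting line."""
    return [r for r in rows if r == viewer or reports_to(r, viewer)]

print(visible_rows("vp_sales", list(ORG)))  # the sales VP sees only their own line
```

If "rep_1" later transfers under "vp_eng", the same rule grants the engineering VP access and revokes it from the sales VP with zero configuration changes, which is the "follows them through the organization" property described above.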
This is not something you can achieve with Tableau, Qlik, or any other vendor in our space. They come close but they don't understand the relationship between a user and the data itself, which results in constant role security maintenance -- if the desired access can be achieved at all.

Why it matters

Most teams have self-service as part of, or as the goal of, their People Analytics roadmap. If you want to deliver self-service with HR data, you’ll need to effectively and sustainably manage fine-grained sets of permissions like the ones described above. Here’s a look at what is possible with the right role-based security capabilities. Let’s say that you’ve developed an integrated recruiting effectiveness dashboard. Your business leaders, recruiting managers, and HRBPs all have access to this dashboard. Based on aggregate data, your business leader can see that the new candidate assessment is, in fact, doing a great job predicting high performing new hires across the company. She drills into her own team’s details and scans through a few examples. This builds her confidence both in the assessment tool and in the dashboard. She’s likely to come back and use other dashboards in the future. The recruiting manager, looking at the same dashboard, is excited by the overall results, but wants to see if this assessment result is having a negative impact on protected groups of candidates in the hiring process. Given her role, you’ve given her access to aggregate slices of demographic data. She uses dashboard filters to cut the data by gender, age, and ethnicity without having to request a one-off ad-hoc report. She’s ready when the topic comes up in a meeting later that day. She thanks you the next time you see her. The division’s HRBP has similar ideas but her security clearance is more complex. 
Her division is split across countries, and due to local laws in one country she's not allowed to view performance ratings or conduct age and gender analyses; those fields are seamlessly unavailable for that population. With this limitation in place, she wants to take things a step further in the One Model Explore tool and analyse a recent change to recruiting practices. She combines assessment results and termination data along with her most recent employee survey results. The results are so interesting that she reaches out to you. “Hey, my termination rates are down. We think we’re making better hires based on this new assessment tool, and employee satisfaction is up as well. These are all good signs, but can you figure out which results are driving the others?” After a cursory analysis, the next step is to prove there is a correlation and quantify its impact with the built-in One AI machine learning suite. Awesome. Isn’t this scenario why your company funded the program in the first place? Without advanced role-based permissions? Well, you probably know that story already. It starts with a generic, one-size-fits-all dashboard. The plot thickens with the arrival of ad hoc reporting requests and VLOOKUPs. And the story ends with… well… more ad hoc reporting and VLOOKUPs. If this is something that excites you, let's talk. Click the button below to schedule some time with a One Model team member. We can answer any specific questions you may have, or just chat about role-based permissions (if that's what you're into).
Read Article
Featured
4 min read
Stacia Damron
Find our team in a city near you, and stop by in person to learn more about our workforce analytics solutions. February 9, 2018 - Austin, TX - The One Model team recently returned from the People Analytics and Future of Work (PAFOW) conference in San Francisco, where we participated as a key sponsor and speaker. There, our CEO, Chris Butler, was invited to announce a preview of our latest feature: One AI. (Above) One Model CEO, Chris Butler, announces One Model's newest tool: One AI, at PAFOW in San Francisco. One AI is a huge leap into the future of workforce analytics. Finally - there's a tool that makes machine learning readily accessible to HR professionals. By applying One Model's full understanding of HR data, our machine learning algorithms can predict on any target that our customers select. For example, this means a turnover risk predictive model can be created in minutes, consuming data from across the organization, cleaned, structured, and tested through dozens of ML models and thousands of hyperparameters to select a unique, accurate model that can provide explanations and identify levers for reducing an individual employee's risk of turnover. Our Next Stop: London The One Model team will be showcasing One AI at the People Analytics World Conference in London this April. We invite HR professionals, people analytics experts, and partners to join. Come find the One Model team and learn more about our workforce analytics software for HR professionals and data scientists. If you'd like an opportunity to meet the team in person and learn more, we'll be attending the following events later this year: People Analytics Conference - London, England - April 11-12, 2018 HR Technology Conference and Expo - Las Vegas, NV - September 11-13, 2018 More events, TBD. “As One Model continues to expand our client base in the U.S. 
and abroad, we’re looking forward to participating in more international HR, data science, and AI events,” says One Model’s Senior Marketing Manager, Stacia Damron. “Both domestic and international trade shows have helped us showcase our workforce analytics solution to a broader, more diverse audience, and they offer us an opportunity to foster and maintain valuable relationships with clients and partners alike."
Read Article
Featured
13 min read
Chris Butler
I recently made a simple post on LinkedIn which received a crazy number of views and overwhelmed us with requests to take a look at what we had built. The announcement was that we had managed to take Workday's point-in-time (snapshot)-based reporting and rebuild a data schema that is effective-dated and transactional in nature. The vast majority of organizations and People Analytics vendors use snapshots for extracting data from Workday, because this is really the only choice they've been given to access the data. We don't like snapshots, for several reasons:

They are inaccurate - you will typically miss the changes occurring between snapshots, which makes it impossible to track data/attribute changes in between, to pro-rate, or to create analysis any deeper than the snapshot's time context.

They are inflexible - an object or time context has already been applied to the data, which you can't change without replacing the entire data set with a new context.

They don't allow for changes - if data is corrected or changed in history, you need to replace the entire data set. Urggh.

External data is difficult to connect - without effective dating, joining in any external data means you have to assume a connection point and apply that time context's values to the external data set. This compounds the inaccuracy problem if you end up having to snapshot the external data as well.

A pain in the #$% - to pull snapshots from Workday, you now need to create a report for each snapshot period you need. Three years of data with a month-end snapshot? That's 36 reports to build and maintain.

With our background in working with raw data directly from HR systems, this approach wasn't going to cut the mustard and couldn't deliver the accuracy that should be the basis of an HR data strategy. The solution is not to buy Workday's big data tools, because you're going to be living with many of the same challenges. 
You need to take the existing structure, enhance it, and fundamentally reconstruct a data architecture that solves these problems. We do just that: we extract all employee and object data, analyze the data as it flows, and generate additional requests to the Workday API that work through the history of each object. Data is materialized into a schema close to the original, but with additional effective-dated transactional records that you just wouldn't see in a snapshot-based schema. This becomes our raw data input into One Model, delivered to your own warehouses to be used any way you wish. The resulting dataset is perfect for delivering accurate, flexible reporting and analytics. The final structure is actually closer to what you would see with a traditional relational schema used by the HRIS systems sold by SAP, Oracle, PeopleSoft, etc. Say what you will about the interfaces of these systems but, for the most part, the way they manage data is better suited for reporting and analytics. Now don't get me wrong, this is one area most people know Workday lags in, and in my opinion it should be a low-priority decision point when selecting an HRIS. Don't compromise the value of a good transactional fit of an HRIS for your business in an attempt to solve for the reporting and analytics capability, because ultimately you will be disappointed. Choose the HRIS that fits how your business operates, solving for the reporting and analytics needs in another solution as needed. Time to get a little more technical. What I'm going to discuss below is the original availability format of the data in comparison to the approach we take at One Model.

Object-Oriented - The Why of the Snapshot

Okay, so we all know that Workday employs an Object-Oriented approach to storing data, which is impressively effective for its transactional use case. It's also quite good at being able to store the historical states of the object. 
You can see what I mean by taking a look at the API references below: The above means the history itself is there, but the native format for access is a snapshot at a specific point in time. We need to find a way of accessing this history and making the data useful for more advanced reporting and analytics.

Time Context

In providing a point in time, we are applying a time context to the data at the point of extraction. This context is then static and will never change unless you replace the data set with a different time context. Snapshot extractions are simply a collection of records with a time context applied. Often when extracting for analytics, companies will take a snapshot at the end of each month for each person or object. We get a result set similar to the below: The above is a simple approach but will miss the changes that occur between snapshots, because they're effectively hidden and ignored. When connecting external data sets that are properly effective-dated, you will need to decide which snapshot to report against, but you simply don't have enough information available to make this connection correctly. This snapshot is an inaccurate representation of what is really occurring in the data set; it's terrible for pro-rating calculations to departments or cost centers, and even something as basic as an average headcount is severely limited. Close enough is not good enough. If you are not starting out with a basis of accuracy, then everything you do downstream has the potential to be compromised.

Remove the Context of Time

There's a better way to represent data for reporting and analytics:

Connect transactional events into a timeline
Extract the details associated with the events
Collapse the record set to provide an effective-dated set of records. 
The above distills the record set down to only what is needed and matches transactional and other object changes, which means you can join to the data set at the correct point in time rather than approximating.

Time Becomes a Flexible Concept

This change requires that you apply a time context at query time, providing infinite flexibility for aligning data with different time constructs, such as:

Calendar
Fiscal
Pay periods
Weeks
Any time construct you can think of

It's a simple enough join to create the linkage:

left outer join timeperiods tp on tp.date between employee.effective_date and employee.end_date

We are joining at the day level here, which gives us the most flexibility and accuracy but will absolutely explode the number of records used in calculations into the millions and potentially billions of intersections. For us at One Model, accuracy is a worthwhile trade-off, and the volume of data can be dealt with using clever query construction and, of course, some heavy compute power. We recently moved to a Graphics Processing Unit (GPU)-powered database because, really, why would you have dozens of compute cores when you can have thousands? (As a side note, it also allows us to run R and Python directly in the warehouse. #realtimedatascience) More on this in a future post, but for a quick comparison, take a look at the Mythbusters demonstration.

What About Other Objects?

We also apply the same approach to the related objects within Workday so that we're building a historical, effective-dated representation over time. Not all objects support this, so there are some alternative methods for building history.

Retroactive Changes?

Data changes and corrections occur all the time; we regularly see changes being most active in the last six months, but they can occur several years in the past. Snapshots often ignore these changes unless you replace the complete data set on each load. 
The smarter way is to identify changes and replace only the data that is affected (i.e., replace all historical data for a person who has had a retroactive change). This approach facilitates a changes-only feed and can get you close to a near-real-time data set. I say "close to near-real-time" because the Workday API is quite slow, so speed will differ depending on the number of changes occurring.

Okay, So How Do You Accomplish This Magic?

We have built our own integration software specifically for Workday that accomplishes all of the above. It follows this sequence:

1. Extracts all object data, and for each object...
2. Evaluates the data flow and identifies where additional requests are needed to extract historical data at a different time context, then...
3. Merges these records, collapses them, and effective-dates each record. We now have an effective-dated historical extract of each object sourced from the Workday API.
4. This becomes the raw input source into One Model. It is highly normalized and enormous in scope, as most customers have 300+ tables extracted. The pattern in the image below is a representation of each object coming through; you can individually select each object slice.
5. The One Model modeling and calculation engines take over to make sense of the highly normalized schema, connect in any other available data sources, and deliver a cohesive data warehouse built specifically for HR data.
6. Data is available in our toolsets, or you have the option to plug in your own software like Tableau, Power BI, Qlik, SAS, etc.
7. One Model is up and running in a few days. To accomplish all of the above, all we need is a set of authorized API credentials with access to the objects you'd like us to extract.
8. With the data model constructed, the storyboards, dashboards, and querying capabilities are immediately available. 
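To make the merge-and-collapse step concrete, here's a toy Python sketch of collapsing a run of point-in-time records into effective-dated intervals. This illustrates the general technique only, not One Model's actual integration code; the attribute names, dates, and the open-ended end date convention are all invented for the example:

```python
# Toy illustration of collapsing point-in-time records into
# effective-dated intervals (NOT One Model's integration code).
from datetime import date, timedelta

def collapse(snapshots):
    """Collapse date-ordered (snapshot_date, attributes) pairs into
    effective-dated records: (effective_date, end_date, attributes).
    The most recent record is left open-ended (end_date=None)."""
    out = []
    for snap_date, attrs in snapshots:
        if out and out[-1][2] == attrs:
            continue                      # no change: current interval extends
        if out:                           # close the previous interval the day
            prev = out[-1]                # before the new value takes effect
            out[-1] = (prev[0], snap_date - timedelta(days=1), prev[2])
        out.append((snap_date, None, attrs))
    return out

# Hypothetical history: a department transfer, then a grade change.
history = [
    (date(2024, 1, 1),  {"dept": "Sales",       "grade": "G5"}),
    (date(2024, 2, 1),  {"dept": "Sales",       "grade": "G5"}),  # unchanged
    (date(2024, 3, 15), {"dept": "Engineering", "grade": "G5"}),
    (date(2024, 6, 1),  {"dept": "Engineering", "grade": "G6"}),
]
for rec in collapse(history):
    print(rec)
```

Note how the unchanged February record disappears entirely: the collapsed set carries only genuine changes, which is exactly why it stays small enough to join against a day-level date spine later.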
Examples: Flexibility - The Biggest Advantage You Now Have We now have virtually all data extracted from Workday in a historically accurate transaction-based format that is perfect for integrating additional data sources or generating an output with any desired time context (even convert back to snapshots, if required). Successful reporting and analytics with Workday starts with having a data strategy for overcoming the inherent limitations of the native architecture that is just not built for this purpose. We're HR data and People Analytics experts and we do this all day long. If you would like to take a look, please feel free to contact us or book some time to talk directly below. Learn more about One Model's Workday Integration Book a Demo
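To show what "applying a time context at query time" looks like in practice, here's a minimal runnable sketch using SQLite from Python. The employee and timeperiods tables follow the join snippet earlier in the post, but all data, dates, and the month-end context are invented for illustration; it rolls effective-dated records up to month-end headcounts, effectively regenerating snapshots on demand:

```python
# Sketch of the day-level join from the post: effective-dated records
# joined to a date spine, with a month-end context applied at query time.
# Table/column names follow the article's snippet; the data is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE employee (
    employee_id    TEXT,
    department     TEXT,
    effective_date TEXT,   -- record valid from this day...
    end_date       TEXT    -- ...through this day (inclusive)
);
CREATE TABLE timeperiods (date TEXT);
""")
# Two employees; one transfers departments mid-March.
cur.executemany("INSERT INTO employee VALUES (?,?,?,?)", [
    ("E1", "Sales",       "2024-01-01", "2024-03-14"),
    ("E1", "Engineering", "2024-03-15", "9999-12-31"),
    ("E2", "Sales",       "2024-02-10", "9999-12-31"),
])
# Date spine: every day in Q1 2024, built with a recursive CTE.
cur.execute("""
WITH RECURSIVE d(date) AS (
    SELECT '2024-01-01'
    UNION ALL
    SELECT date(date, '+1 day') FROM d WHERE date < '2024-03-31'
)
INSERT INTO timeperiods SELECT date FROM d
""")
# The join from the post, with a month-end time context chosen at query time.
rows = cur.execute("""
SELECT tp.date, e.department, COUNT(*) AS headcount
FROM timeperiods tp
LEFT OUTER JOIN employee e
    ON tp.date BETWEEN e.effective_date AND e.end_date
WHERE tp.date IN ('2024-01-31', '2024-02-29', '2024-03-31')
GROUP BY tp.date, e.department
ORDER BY tp.date, e.department
""").fetchall()
for row in rows:
    print(row)
```

Swapping the WHERE clause for fiscal period ends, week ends, or pay period dates regenerates the same metric under any time construct, without re-extracting anything; that is the flexibility the effective-dated schema buys.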
Read Article
Featured
6 min read
Mike West
You may not think your company is doing People Analytics yet, but here is one way your employer is already doing so, poorly. Employers routinely use credit scores to screen candidates. “47% of employers conduct credit checks to screen potential new hires” (Society for Human Resource Management). That is to say, they use a score designed to measure credit default risk and apply it to employment screening on some dubious premise that it may be predictive of success, good judgment, or good moral character in the context of employment. What is a credit score? A credit score is a numerical expression of an analysis of a person's credit files, representing the creditworthiness of that person. Lenders, such as banks and credit card companies, use credit scores to evaluate the potential risk posed by lending money to consumers and to mitigate losses due to bad debt. Lenders use credit scores to determine who qualifies for a loan, at what interest rate, and with what credit limits. A credit score is based on a history of financial transactions sourced from credit bureaus. A credit score applies a mathematical "algorithm" to a profile and history of transactions to categorize a loan candidate, so that financial institutions can make a better decision about whether or not to lend someone money, and/or about how to price a loan profitably with regard to risk. A bank may offer a low-credit-score candidate a loan, but at a higher interest rate. This allows the bank to maintain a certain level of profitability from different risk segments, or to decline certain risk segments altogether. Why do employers use credit scores to screen employees? In employment, the credit score is presumably used for a similar high-level purpose as in banking (controlling risk). In the case of employment, you are trying to reduce the risk of a "bad" hire or, stated inversely, the risk of not selecting a good hire. 
Employers have more applicants than available positions and limited time to consider each candidate, so they want some quick means of making a decision. Or sometimes we make a decision but apply additional rigor through screening devices to prove to ourselves we made the right decision. Directly from a credit reporting firm's website: credit scores can help employers “make decisions quickly and easily when deciding on potential candidates” (TransUnion). To shed additional light on this practice, I bring up a conversation at a recent U.S. Senate hearing:

“What is the evidence that there is strong correlation between accessing an applicant's credit history and eventually problems of loss to the employer?” - Senator Rosenbaum (Oregon)

“We don’t have any research to show any statistical correlation between what is in somebody’s credit report and their job performance or their likelihood to commit fraud.” - Erick Rosenberg (TransUnion)

Here are some other facts to consider about using credit scores for employment screening:

- 1 in 4 credit reports have been found to have an error
- 1 in 20 have been found to have serious errors
- Other issues: “52% of all debt on credit reports stems from medical expenses” (CFPB)

A 25% error rate? If you thought your core HR data had problems, maybe it doesn't sound so bad now. Scores are being used as a reflection of "character," but if the medical debt statistic is correct, in half or more cases credit scores may be low because of uncontrollable circumstances. How is that reflective of character? Imagine that as a result of chance circumstances you are in dire need of money, and as a result you are also refused a means of obtaining money. Does this practice make any sense? It's predictive, but you don't need any math at all to predict this outcome: it's a self-fulfilling prophecy, and self-fulfilling prophecies are convenient if you sell predictions. How do the credit reporting agencies and employers defend this practice? 
The "all else equal" claim: that, "all else equal," this is a better way of making a decision than nothing. Is it really? If we can find evidence of errors and systematic bias, and NO evidence of any relationship with job performance - is this really a better way of making a decision than a coin flip? Why not do the work to find something else, not equal, that has fewer errors, less systematic bias, or more evidence of a relationship to job performance? Why not find a better way of making things not equal?

"We have always done it this way" or "this is how others are doing it." Come on, is this really a good reason? Is this how to make good business decisions? This is not how any great business decision is made, ever. It merely seems like a "plausible method" of making decisions by logic or rational argument. Here are some theories that are at least as plausible as the theory that credit reports may be good predictors of performance... Maybe because we really have no idea how to screen a candidate for characteristics that relate to performance without doing this work directly... Maybe because we want to find a way to systematically discriminate against populations that come from impoverished communities, thus discriminating against a high percentage of minorities, without doing so directly... Or we don't delve into the details to see how the math may or may not play out, and we don't care. Maybe because we want to eliminate candidates who have high medical expenses and would cost our health plans money, without doing so directly... (This actually seems like the most mathematically plausible scenario to me.)

I ask again: why are you using credit scores to screen job applicants? How about this: why don't we actually do the work to see what factors drive performance in our organization, and/or isolate with data how we can increase the probability of high performance, regardless of starting individual characteristics?
To me this is a better way of making decisions and a better way to run a business. Thank you to John Oliver (yes, John Oliver the comedian) for highlighting these issues in a clear, emotionally charged and entertaining way on his show, Last Week Tonight (HBO). Be advised - wear headphones - there is some language in it that may be offensive and not safe for a work environment. I would argue the practice of using credit reports as screening devices is equally offensive and unsafe. I wish each employer would put as much time into researching the practice of using credit reports as employment screening devices as an HBO comedian did. It would be even more useful for us to know these things than it is for him, would it not?
12 min read
Mike West
“Half the money I spend on advertising is wasted; the trouble is, I don't know which half.” Henry Ford, Lord Lever, John Wanamaker...

People often ask, “Does People Analytics work?” These are smart people who genuinely would like a straight answer to an honest question. I will sometimes start a conversation about People Analytics with something like this: did you know that in the United States alone, companies currently spend about 7 trillion dollars on employee payroll*? My experience has been that how, where, when and why money is spent matters, and results will vary with how well or poorly money is invested. Unfortunately, having worked in HR for over 15 years, I can tell you that more often than not decisions affecting how people are managed are: a) coincidental, b) arbitrary, c) biased or d) political (by this I mean rooted in conflict – the expression of personal or group advantages and disadvantages). None of these is a good reason for a business, for people or for the economy. (*It is probably over 10 trillion when you include the cost of benefits and other perks.)

Before you can say whether or not People Analytics works, you need to know what its job is. We can gather from the name that People Analytics is the application of analysis to people. O.k., but what's the goal of People Analytics? How can you say whether it works if you don't know what it's supposed to achieve? In prior People Analytics Q/A posts I attempted a definition - I was looking for a unique combination of words to represent the essence of a complex concept simply, without missing what is different about it. This is what I came up with: People Analytics is the systematic application of behavioral science and statistics to Human Resource Management to achieve probability-derived business advantages. I chose these words deliberately, but admittedly a little too deliberately to roll off the tongue in casual conversation. So let's take a step back and talk about it.
More simply put, People Analytics is the application of analysis to people in a business. For the sake of discussion we are going to put "people in a business" in the realm of Human Resources. To understand People Analytics, let's first understand the job of HR. Most people think of HR as a series of low-level administrative activities - record-keeping, recruiting, benefits administration - or as the legislative compliance arm of the government that sits in your organization. There is an element of truth to this. No organization could exist for very long without some attention to those things; however, if you think this is all HR is, you simply have not been exposed to HR in a successful, large, modern organization.

The need for a dedicated HR function, historically and for any given organization, arises when we go from something small – where you can know everybody and everything that is going on in an organization – to something big – when it is no longer possible for one person to know everybody and everything that is going on. If at some point no one person can see clearly what is actually going on, the organization will grow into chaos. The reaction to chaos, now and forever, will be: “My God, we need to get control over this.” Ultimately successful organizations eventually arrive at, "We need a Human Resource function." At its essence, HR is about controlling the chaos of organization growth. Control doesn't sound very exciting. A more exciting way of framing it: if your investors' wishes come true (growth), then Human Resource Management is your reward. If HR is a necessity as a consequence of growth, and growth is the primary way businesses derive value for shareholders over the long run, then it might be accurate to describe HR as both a consequence of growth and a cause. Yes, just as eggs and chickens go, so goes HR.
It is a great mystery; however, in the case of Kentucky Fried Chicken I will concede that I think the fried chicken came first, and HR came along later. The idea that having an HR function may be a good thing for an organization begins with the need to create efficient processes to expand the business, and eventually evolves into a more expansive appreciation of what intelligent employees and management expect out of an organization.

Tending to the Employment Relationship The main thing everybody in an organization has in common is that we are all getting paid – that makes it a job - otherwise we probably wouldn't be together 8 hours per day, 5 days per week, 50 (+/-2) weeks per year, or whatever you do. You may like each other, but you probably don't like each other that much! The other thing we all have in common is that we are at this job voluntarily – beyond acknowledging that the dissolution of direct slavery is a good thing – we can all thankfully acknowledge that labor markets can sometimes be competitive, and therefore opportunities abound for us. There are two sides to the equation, and both sides make choices. The relationship between an employee and an employer is a rich interaction that can be understood through many different lenses (Psychology, Sociology, Social Psychology, Labor Economics, Anthropology, etc.); however, these interactions boil down to decisions and consequences. Decisions are at the crux of our interaction. The leadership of an organization can decide to treat people however they want, but they don't get to decide whether people come or go as a consequence – the people decide this. The people also decide the level of creativity and effort they are willing to exert on behalf of an organization. In the short term, none of these interactions may appear to matter; in the long term it is clear they always do. Achieving clarity in decisions to achieve desired outcomes is the entire point of management.
HR wants a seat at the business table, but the truth is they were silent participants all along.

Strategic Human Resource Management After we get the basic blocking and tackling of the organization under control, modern Human Resource Management extends into the macro-concerns of the organization regarding structure, quality of talent, culture, values, matching resources to future needs, and other longer-term people issues related to the organization's plans – we call this group of activities Strategic Human Resource Management. Strategic HRM gives direction on how to build the foundation for strategic advantage through effective organizational structure and design, an employee value proposition, systems-thinking problem diagnosis, and preparing an organization for a changing landscape, which includes new competition, downturns, and mergers & acquisitions. Sustainability, diversity, corporate social responsibility, culture and communication also fit within the ambit of Strategic HRM, reflecting chosen organizational values and their expression in business decision-making. (Loose credit to the Society for Human Resource Management for this.)

If that two-paragraph description of Strategic HRM sounds like something straight out of a textbook, that is because it probably was. You and I both wish I had a better mental firewall. Basically, I'm saying that if you are doing it right, the purpose of HR is not really about specific activities or compliance; it is about enabling business growth, and as you do that, it is about designing an organization for competitive advantage. Good HR should help extend the life of organizations by helping them extend the reach of what they do, or grow better at what they do over time. Here is a brutal fact: over the long run most organizations fail. If over the long run most organizations fail – are you getting a clear picture of how good a job we are doing with this strategic HRM stuff?
Then again, maybe it is not HR's fault; maybe HR had great things to say that were not heard.

What is the goal of People Analytics? In 3 words: do HR better. In more words: help leaders and employees make better decisions together – reinforcing organization-based competitive advantages – which result in sustained organization growth over time. What is the job of People Analytics? People Analytics provides a means to see and explain what is going on inside an organization. People Analytics provides a framework to give HR's disjointed practices a reason, coherence and direction. People Analytics also gives HR a more powerful language to communicate with others.

Does People Analytics work? Here is an interesting 18-minute macro way of answering this question, in the form of a TED talk: The Surprising Math of Cities and Corporations, by Geoffrey West. Or you can formulate your own answer in less time than that by thinking about the following questions... What is it worth to you to find out what characteristics in a manager lead to a statistically better-performing software engineering or sales team? What is the ROI of using math and science to identify a poor manager who, unchecked, could influence the organization to do things that result in devastating class action litigation against your company? How much more likely is a person to be motivated to react to criticism of their management style when presented with data: “We asked the same questions of all managers and you are in the bottom 25th percentile on these measures – please explain”? If you believe that the best hiring criterion for success in your organization is intelligence, represented by an Ivy League education, and that is not actually true, what is that worth to you? How much additional do you pay for a graduate of an Ivy League education, times x number of employees, over time?
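That last question is plain multiplication, which is exactly why it is worth asking out loud. A sketch with assumed numbers (the premium, headcount and time horizon are all hypothetical):

```python
def cumulative_pay_premium(extra_pay_per_year, employees, years):
    """Total extra spend from paying a salary premium for a credential,
    assuming (hypothetically) the premium buys no extra performance."""
    return extra_pay_per_year * employees * years

# Assumed: a $20k/yr pay premium, 50 such hires, each retained 5 years
print(cumulative_pay_premium(20_000, 50, 5))  # 5000000
```

Five million dollars, in this toy scenario, riding on an unexamined belief about what predicts success. The same arithmetic shape applies to the benefits and attrition questions that follow.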
Further, what additional costs to business performance follow from being wrong about what makes people successful in an organization? Not to mention, what possible business drags and penalties will accumulate when your organization grows less statistically representative of its community as a result of a faulty premise? What is it worth to you to discover that a many-million-dollar benefit program actually does nothing to retain employees, when that was the primary reason cited for implementing it? What is it worth to you to discover that a 10-million-dollar benefits program now will become a 100-million-dollar benefits program later, if your organization continues to grow on the same track? What is it worth to you to avoid the consequences to commitment and morale of offering a benefit and not being able to deliver on it, or slowly taking it back over time? Does the current gain in morale from implementing this program exceed the future consequences of retracting it? What is the ratio of value to the organization? What is the ROI for a hospital of a 12.5% first-year tenure attrition rate versus 25%? What is the difference in patient satisfaction or outcomes? What is the potential cost of nurse mistakes? What is the reduction in the probability of mistakes from reducing the first-year turnover rate? ...

Based on my experience with the questions above and real organization data, the question “Does People Analytics work?” is an absurd one. Would you dream of asking a mathematician, “Does math work?”, a scientist, “Does science work?”, an engineer, “Does engineering work?”, or a doctor, “Does medicine work?” Someday I hope we will move past this question for People Analytics.

---------------------------------------------------------------------------------------- #PAQA = People Analytics Question and Answer Series What is People Analytics? What is not People Analytics? Why People Analytics? What is the history of People Analytics?
What are the key questions of People Analytics? What is the actual work of People Analytics? What are the alternatives to People Analytics? What is the technology of People Analytics? Feel free to suggest more. ---------------------------------------------------------------------------------------- Who is Mike West? Mike has 15 years of experience building People Analytics from the ground up as an employee at the founding of Merck HR Decision Support, PetSmart Talent Analytics, Google People Analytics, Children's Medical (Dallas) HR Analytics, and PeopleAnalyst - the first People Analytics design firm - working with Jawbone, Otsuka and several People Analytics technology start-ups. Mike is currently the VP of Product Strategy for One Model. One Model takes data from numerous systems and organizes it so that you can measure, predict and influence workforce behavior to effect change. Mike's passion is figuring out how to create an analysis strategy for difficult HR problems. Connect with Mike West on Linkedin
5 min read
Mike West
#1 reason people leave Google: to connect with their personal calling to change the world. Ironically, it is also the number one reason people join Google. To be fair, I'm a data guy and I don't have access to their data anymore - so consider this an educated guess. It is actually helpful that I haven't seen the data in a long time (it's fuzzy), because I can speculate without any suggestion that I am sharing secrets. Most importantly, you can read what Laszlo Bock has to say about what attracts, motivates and retains employees directly from him. I do not believe I am off message at all. Across a number of people-data projects we see one big theme: you best attract, engage and retain people by connecting with their passion. The role of the company is to leverage its assets to support people in materializing their dreams, because some of those crazy bets are going to hit someday.

Get out of the way Google's head of Engineering, Alan Eustace, once said, "One thing is certain about the next big idea: I probably won't have it." (I paraphrase from memory.) They work hard to attract, engage and retain talented employees for this reason. They pay more than lip service to these ideas - they invest big in them. As a result of this effort they have much lower average overall turnover than most companies, and really, really low key-employee turnover; however, even Google can't keep them all. Keeping them all might not even be the right goal (for them, the company or the world).

Here is a human story. I worked with the now-CEO of Omada Health, Sean Duffy, years ago in the Google People Analytics group. We were analysts. I recall the moment when he was accepted to Harvard Medical School. He had excitement on his face; I asked him what was going on and he showed me the letter. Looking in his eyes I could see it and said, "I'd bet a million dollars you take it." I was right and I was wrong. He went, but soon thereafter he met up with some other guys and they decided to start a company together.
They have been working on it for over 5 years; they have iterated, they have raised a lot of money from VCs, and I think they have something. Most importantly, I believe they have a unique business model. If I understand their business correctly, they don't take money for services up front. They get paid by changing population health outcomes - or, if not that precisely, at least there is some kind of reconciliation based on outcomes. They can make money from interested insurance companies, large employers and now, it sounds, from Medicaid/Medicare. This might just be the future of medicine.

Historically, Google has hired people who were overqualified for their roles: a) because Google is attractive enough that it can, and b) because it was betting on the future of the person with the company, not a temporary functional skill-set. The result is that you end up with too many individuals in small roles chasing too few big roles, and you get a jammed-up talent funnel. If you are stuck in small thinking, there is no solution. You choose winners and losers. You acknowledge that some of your good people are going to go, and that's o.k. You wish them well and try to maintain good relations through a strong alumni network. However, Google doesn't think small. The most interesting thing for me is to watch the big business changes at Google that I believe may directly stem from people data. One word, many letters: Alphabet. What other company would do this? Crazy? No way. The beauty of Alphabet is manifold. By splintering into many parts, Google reaps the following benefits: 1) increasing the number of available internal paths to big roles, 2) clarification of decision-making, and 3) clarification of impact - not only on an entity level but on an individual level. Opportunity can expand infinitely if you think big. Opportunity + Talent + Resources = an economic engine for the future.
Don't take my word for it; here is what Google's Chief Financial Officer said: “This belief was the impetus for our organizational structure, which enhances focus on opportunities within Google and across Alphabet, while also pushing our leadership to extend the frontier that we are addressing,” said Alphabet's chief financial officer, Ruth Porat, in the company's last earnings call. This quote is taken from The Atlantic's "Alphabet, Jigsaw, and the Puzzle of Google's New Brand". Stop a minute and think about this. Google's Chief Financial Officer is talking about organization structure and leadership opportunities as a core component of business strategy. Yes, that just happened. No doubt about it, Google's management team is different. They are also smart, they mean business AND they get people. This is The_New_HR. Say what you will about Google; I believe they will have this people-growth-driver thing pretty much wrapped up if other companies don't get on it fast. Alphabet will allow Google to take apart any industry they want to. Consider this fair warning.
8 min read
Mike West
January 18th, 2016 In the 53 years that have passed since Dr. Martin Luther King Jr. delivered his I Have a Dream speech, America has clearly come a long way; however, the work of freedom is not finished. If you listen carefully, the words of Dr. King's speech are still as relevant today as ever. He was speaking primarily to the obvious injustices against people of color in his time - and some of those measures have progressed; specifically, black people suffer fewer lynchings and can actually directly influence the political process - yet, upon close inspection of the facts, it is obvious there is a startling and frustrating lack of progress today. On nearly all fundamental metrics of prosperity we see massive, pernicious disparity by race. How could we have worked so hard and arrived in a place not really that much different from where we started?

I will not insult you with an easy or absolute answer: the 3-step process for racial equality, or the 5 reasons things are or aren't as bad as some would claim. Difficult problems cannot be satisfied by the sticky residue of low-hanging fruit. I will share my few observations. I don't know if the world has really become more complex, but it certainly seems like it has. Over my lifetime I have worked for over 10 different organizations in professional and nonprofessional roles and have never met a blatant racist at work (and very few even in my personal life). How much easier it would be to confront the obvious ignorance of racism directly. That type of problem, it seems, could be tidied up promptly. However, that is not our work today. The problem we are dealing with is much, much more insidious. I can find no racists, but of "unconscious bias" I could dredge up volumes for you. Here is a tidy summary published by the New York Times: Racial Bias, Even When We Have Good Intentions.
Lacking an actual human descriptor, we are fighting an elusive thief that we are forced to call by some inhuman name: "unconscious bias". Who believes in something you cannot see and a story you cannot tell? Imagine the frustration of toiling hard each day, stockpiling the fruit of your labor for tomorrow's meal, and finding that each night some unknown stranger steals it from you in the quiet of the night. You have no way of catching, identifying or accusing this stranger, and nobody believes you. Your lack of food must be explained by something else - perhaps you are lazy. You have spoken about it for so long, and this injustice is so unbelievable, that sometimes even you wonder if you have gone crazy. This is a description of the actual horror of unconscious bias.

One of the things we are doing now, differently than in the past, is beginning to face this thief directly. It turns out that clever people have devised clever ways of actually catching the thief in the act. Here is a video of great work conducted by People Analytics at Google; the speaker is Brian Welle, a former colleague of mine: https://youtu.be/nLjFTHTgEVU . Brian will blow your mind. Fundamentally, at its essence, People Analytics is about using clever research methods and data to reduce mistakes of human bias, because bias causes us to make worse decisions for our business than we would have made with a more perfect understanding of truth. It was only a matter of time before People Analytics would turn its attention directly upon matters of diversity too. While justifiable in its own right as an effort for "fairness" or for "the law", we don't actually stand against bias at work just for these reasons - we benefit from truth too. This is not benevolence or charity - that demeans it. The most astounding thing about working on issues of bias is that when we make decisions with less bias, we benefit directly too!
If you did not get this point from your reading of the book "Moneyball", or from watching the movie, go back and watch it again - you missed an important detail. They did not do this "diversity thing" because they felt sorry for people who look different or throw the ball weird - it turns out that if you like winning, people who throw the ball weird might make great teammates! This is the beauty of all things good and eternal. In truth there is actually no threat to anyone. Open up the door; let every truth come in. The house only expands. We may never reach a place of perfect truth or perfect answers on this earth, but as I am reminded by Dr. Martin Luther King Jr., I too refuse to believe that there are insufficient funds in the great vaults of opportunity of this nation (as he addressed the United States of America in Washington, D.C., in 1963). I leave you with his words, which, even 50 years later, never fail to bring tears to my eyes. We are not where we wanted to be, but Dr. Martin Luther King Jr. is no less a prophet if we open our minds, hearts and ears.

"In a sense we've come to our nation's capital to cash a check. When the architects of our republic wrote the magnificent words of the Constitution and the Declaration of Independence, they were signing a promissory note to which every American was to fall heir. This note was a promise that all men, yes, black men as well as white men, would be guaranteed the "unalienable Rights" of "Life, Liberty and the pursuit of Happiness." It is obvious today that America has defaulted on this promissory note, insofar as her citizens of color are concerned. Instead of honoring this sacred obligation, America has given the Negro people a bad check, a check which has come back marked "insufficient funds." But we refuse to believe that the bank of justice is bankrupt. We refuse to believe that there are insufficient funds in the great vaults of opportunity of this nation.
And so, we've come to cash this check, a check that will give us upon demand the riches of freedom and the security of justice." - Dr. Martin Luther King Jr. Full I Have a Dream speech: https://youtu.be/I47Y6VHc3Ms

---------------------------------------------------------------------------------------- Who is Mike West? Mike has 15 years of experience building People Analytics from the ground up as an employee at the founding of Merck HR Decision Support, PetSmart Talent Analytics, Google People Analytics, Children's Medical (Dallas) HR Analytics, and PeopleAnalyst - the first People Analytics design firm - working with Jawbone, Otsuka and several People Analytics technology start-ups. Mike is currently the VP of Product Strategy for One Model - the first cloud data warehouse platform designed for People Analytics. Mike's passion is figuring out how to create an analysis strategy for difficult HR problems. Connect with Mike West on Linkedin ----------------------------------------------------------------------------------------

I'm putting together a series of live group webinars in which I will reveal a process for dramatically increasing the probability of success of People Analytics - building on a career of successes and failures (Merck, PetSmart, Google, Otsuka, Children's Medical Dallas and Jawbone) - and applying new ideas I have developed over the last few years bringing ideas from Lean to People Analytics. The goal of this webinar series is to engage a select group of qualified early adopters who have access to an organization, are willing to apply the process, report back how things are going, and work out the bugs together. This group will have the opportunity to share their findings with the broader People Analytics and HR community, if you choose. If you have interest in participating in the webinar series, let me know here: (http://www.misc-peopleanalytics.com/lean-series) And if you know anyone else who you think would, please let them know too!
17 min read
Mike West
Did you think I'd crumble? Did you think I'd lay down and die? Oh no, not I. I will survive. - Gloria Gaynor, "I Will Survive".

2002, White House Station, New Jersey, Merck, Organizational Learning I am helping to organize a launch meeting for our new Performance Management process, and we need to stand up a website. I studied Sociology and Psychology and finally Human Resource Management, with a natural penchant for abstract mathematics. My skills don't extend into HTML - in 2002 we didn't have SquareSpace - at Merck we were lucky we had Internet Explorer installed. I was invited to go talk to this guy in IT about creating a website for the event. For reasons I will describe, I'll never forget this meeting. To protect his identity I will call him "Stan". I get to Stan's desk area/cube. First, I see this row of stuffed animals. Let me just stop here and provide context. At Merck, at this time, we wore suits and ties to work every day – presumably in case a member of Congress, a foreign dignitary or a Nobel laureate just stopped into the office. We were East Coast Pharma. We were a 100-year-old organization and one of the most productive research institutions in the world. In our minds, we saved lives and we put billions of dollars into the economy, and we were proud of both - we were buttoned up. Guys like me wore single-breasted jackets. If you were at the top, your suit was double, maybe triple breasted. I don't recall what Stan was wearing precisely, but by the row of stuffed animals on his desk, I knew this guy was a strange animal. I was curious how Stan got away with this. Stan told me to come in, pull up a chair and sit down. I sat next to him in his cubicle, side by side, staring at a couple of computer monitors, stuffed animals behind me. He had at least three computer boxes. He asked, “What do you want?” As I recall, at the same time I was speaking he was typing in HTML, occasionally with diversions to chat while he kept typing.
With swift keyboard strokes he would move seamlessly between computers, moving things from a design environment, to test, to production. Somewhere in this exchange I thought, "I don't care about the stuffed animals; this guy is pretty cool." I thought to myself: if I ever go start a company, this is the first guy I would take. For fear of losing access to him, I told no one else about this. I don't know if I ever actually said this to Stan directly. Time went on, and Stan and I went our separate ways. A few times after I left Merck I tried to engage Stan in projects I had going, but it never worked out. I don't know if he kept up with changing technology or not. My learning from Stan is that most of us use computers; some of us use them differently.

I share the story about "Stan" because it makes what I am going to say about Excel just a little more vivid: most of us have used Excel; some of us use it differently. I want you to understand the degree to which I understand, appreciate (and even love) Excel before I describe why it may be the most dangerous business application of all time. This story about Stan and HTML foreshadowed how I would someday use Excel. It didn't come easy. Nobody taught me how to work with HR data in Excel. The way I use Excel is different from that of most people I have ever observed working in Excel. Granted, I got there somewhere after spending over 10,000 hours on it over 15 years. I set up sheets like data stores, with pivot tables that feed into lists that other tables sit on, and lookup functions that move data around and transform it into whatever I need it to be, feeding into downstream analysis and, finally, charts. I have figured out how to do multidimensional reporting in Excel. I work data through recursive algorithms in Excel. I use add-ons to run a variety of statistics.
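The pattern described here - raw data in one sheet, lookups reshaping it, a pivot table summarizing by several dimensions - maps onto a few lines of pandas. A minimal sketch, with an invented toy dataset (all column names and values are made up for illustration):

```python
import pandas as pd

# Toy HR dataset standing in for the raw-data sheet (columns are invented)
hc = pd.DataFrame({
    "dept":      ["Eng", "Eng", "Sales", "Sales", "Eng"],
    "level":     ["L1", "L2", "L1", "L2", "L1"],
    "headcount": [10, 6, 4, 5, 8],
})

# The Excel pivot-table step: a multidimensional headcount report,
# departments down the rows, job levels across the columns
report = hc.pivot_table(index="dept", columns="level",
                        values="headcount", aggfunc="sum", fill_value=0)
print(report)
```

The point is not that pandas replaces Excel, but that the "sheets as data stores feeding pivots" workflow is a real data pipeline, whichever tool expresses it.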
I take the charts provided in Excel and strip them down to the bones, rebuilding them into something beautiful, apparently never imagined by Microsoft. When I am working with Excel, I often don't even see the detail of the data - at times I might as well be operating in another dimension. Having said all this, I'm sure there are people who have even more advanced Excel skills than I do. I'm not saying I'm the greatest - simply that there are differences between how people use Excel, so when we talk about Excel it is important to keep this in mind. Excel can be a very powerful application. 2006, Mountain View, California, Google, People Analytics. There was a time at Google when I was working with employee data in Excel and developing ways to run out reports by all the segments we had in the data. I had spent probably 150 hours iterating report trials off this dataset over several weeks. I started on the Google bus on the way into the office as I travelled from Berkeley to Mountain View, worked all day at Mountain View, stopping only for food and coffee, then continued working on the Google bus on the way home at night, collapsing into bed when I arrived, and starting over again the next day. At one point, while working with the structure of the data, I reached that point just before it was ready to share, and it was like the sun rising on a very dark night. In some way it was like seeing God. That, or I had reached the ceiling for caffeine consumption - one or the other. If I did not see God, I at the very least experienced what it must be like to see pure truth. The best example I can provide of what this is like: it is like standing on the edge of the Grand Canyon or an ocean, or looking at the stars in complete darkness on a crystal clear night.
I have not since recreated a feeling this vivid at work - apparently I lost track of the specific Excel function for this :-) - but what remains with me today is an appreciation that there is a truth encapsulated in data and a beauty in its mathematical structure, which also happens to be powerful. If you spend enough time in it, you might see it. Don't take my word for this; here is a representative thought from a series of quotes about the beauty of mathematics: "It seems to me now that mathematics is capable of an artistic excellence as great as that of any music, perhaps greater; not because the pleasure it gives (although very pure) is comparable, either in intensity or in the number of people who feel it, to that of music, but because it gives in absolute perfection that combination, characteristic of great art, of godlike freedom, with the sense of inevitable destiny; because, in fact, it constructs an ideal world where everything is perfect but true." Bertrand Russell (1872-1970), Autobiography. The way that Excel is different from most other applications for working with data is that in Excel, you can actually directly see the data you are working with. There are a number of other reasons why Excel is the most used business application of all time, but I won't bore you with the nuances. This quote sums it up: "Microsoft Excel is one of the greatest, most powerful, most important software applications of all time. Many in the industry will no doubt object, but it provides enormous capacity to do quantitative analysis, letting you do anything from statistical analyses of databases with hundreds of thousands of records to complex estimation tools with user-friendly front ends. And unlike traditional statistical programs, it provides an intuitive interface that lets you see what happens to the data as you manipulate them" (The Importance of Excel). I love my friend Excel, but I'm about to shake his hand and then pummel him.
The main argument against Excel is that the things that make Excel great are also its biggest downsides. First, Data Quality: Excel makes it too easy for people to make mistakes, and Excel makes it too easy for people to lie. For starters, while it is incredibly easy to get started making spreadsheets, it's also incredibly easy to make mistakes that cost companies millions (or billions) of dollars. In 2008, University of Hawai'i professor Raymond Panko published a summary of 13 field audits that checked spreadsheets used in 'real-world' environments. His analysis found that a whopping 88% of the spreadsheets had errors! In evaluating possible solutions to the spreadsheet errors he described in his 2008 paper, Professor Panko wrote: "... few spreadsheet developers have spreadsheeting in their job descriptions at all, and very few do spreadsheet development as their main task." One problem is that since everybody has at least some knowledge of how to use Excel, many people misjudge their own expertise, as well as that of others. This is different from how we hire and judge software developers. Business managers don't know that there is a problem (actually, lots of problems) with spreadsheets, while IT regards spreadsheets as falling outside its jurisdiction. So spreadsheet management falls into a black hole. While Excel the program is reasonably robust, the spreadsheets that people create with Excel are fragile. There is no way to trace where the data came from, when, and what was done to it. The biggest problem is that anyone can create Excel spreadsheets - badly. Because it's so easy to use, the creation of even important spreadsheets is not restricted to people who understand programming and do it in a methodical, documented way. There are a number of public examples of Excel mistakes, some with substantial impact. 2012, London, "The London Whale". The problems of Excel apply to anything and anyone working with data in Excel, not just HR.
Here is a description of a particularly high impact example at JP Morgan, from Microsoft Excel Might Be The Most Dangerous Software on the Planet: "After the London Whale trade blew up, the Model Review Group discovered that the model had not been automated and found several other errors. Most spectacularly, after subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . ." The explanation in English: someone at JP Morgan was running bets (to the tune of tens of billions of dollars) in Excel, and there was an error. There may have been other negligence or nefariousness going on as well, but I found the most outrageous part of the story to be that this sophisticated derivative work was completed in Excel in the first place. Stupefying, actually. At the time I was getting paid 1,000 times less money to do similar work in Human Resources - you have to ask yourself, "Maybe I should have considered a different profession? I could totally screw up derivatives, maybe even 1/3 as bad as this 'whale guy'." The other ways that Excel falls down are: Difficulty seeing workflow (e.g. how the data goes through stages). Difficulty documenting workflow and the process audit trail. Difficulty with dependence - difficulty transitioning spreadsheets from one person to the next. Stale data and/or constant rework as a result of stale data. Difficulty seeing the real cost of manual work in Excel being performed across the organization. Inability to secure data. 2013, Austin, Texas, AcmeTech, Workforce Analytics. Very few people know that between the time that I worked at Children's Medical in Dallas and started my own consulting company, two and a half years prior to joining One Model, I worked for a technology company in Austin. Let's call them AcmeTech.
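A quick aside before the story continues: the JP Morgan formula error described above is easy to reproduce, and seeing it in a few lines shows how silently it fails. The rates below are made-up numbers, not the actual figures from the case:

```python
# Sketch of the spreadsheet bug described in the JP Morgan story (simplified,
# with hypothetical numbers): the modeler intended to divide the rate change
# by the AVERAGE of the old and new rates, but the formula divided by their
# SUM instead -- silently halving the result.

old_rate = 0.030  # hypothetical old rate
new_rate = 0.036  # hypothetical new rate

intended = (new_rate - old_rate) / ((old_rate + new_rate) / 2)  # divide by average
buggy    = (new_rate - old_rate) / (old_rate + new_rate)        # divide by sum

print(intended)           # 0.1818...
print(buggy)              # 0.0909...
print(intended / buggy)   # 2.0 -- exactly the "muting volatility by a factor of two"
```

Since the sum of two numbers is always exactly twice their average, the buggy formula understates every change by a factor of two - the kind of one-character mistake that a spreadsheet gives you no structural way to catch.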
AcmeTech lured me from a children's hospital in Dallas with higher pay, better benefits, a well-stocked micro kitchen, free lunch on Fridays and a ping pong table. I felt bad leaving the children. Little did I know at the time that, as a result of this decision, I was going to hell. When I interviewed with AcmeTech I described the important analytical work I had done in HR at Merck, PetSmart, Google and even at a very modestly funded non-profit children's hospital. The emphasis of this work was on automating analytical workflow AND then spending my time on more sophisticated and high value analysis like exit prediction models. I thought we were on the same page. Well, I started with AcmeTech and soon learned that, in fact, we were not on the same page. I was expected to create weekly HR reports in Excel for the division I supported. We were dumping data out of WorkDay into Excel, aggregating it into metrics and reporting by segment. I have a history of creating prototype reports just like this in Excel, but these were always temporary, not long term, solutions. At AcmeTech this is something they had been doing for years, and there were some unique nuances to the way they were doing it that prevented full automation. My predecessor diligently showed me how to copy data from one sheet to another, change a series of things (to be recalled by notes or by memory), check for these other things that may or may not go wrong, then publish the reports out by email. A single report would take me a full day to complete, and there were several different versions of these reports for different stakeholders. There were dozens of points of possible failure. There were five of us on this team doing the exact same reports for different divisions.
When I raised this issue with my manager and my manager's manager, I was told, "We want you to keep doing the reports the same way they were done by your predecessor." This was my problem: somehow I had gone from one of the most brilliant People Analytics teams in the world to something slightly above "human cog" in a car headed nowhere, driving off a cliff. Here were my views on AcmeTech's Excel-based Workforce Analytics process: There was not much value in these reports as produced, relative to other work we could be completing on behalf of the organization. There was incredible waste of time and money in this approach - not to mention life effort. When things eventually went wrong, or people were through with this, it would be our necks on the line (in the case of my division, my neck specifically). The way we were running these reports was affecting the quality of experience for the recipients of the reports. I knew there were better ways that would save AcmeTech HR time, money and credibility, and that could be put in place very quickly. My mind just didn't work the way we were running these reports - I could operate more effectively in other capacities. At the end of the day I went back and said, "You hired me for my expertise; there is a better way." The reply was, "Do it our way or leave." My reply: "O.K. then." That's when I decided to go start my own company. I wanted to work with people who really wanted to work with me, or not at all. I acknowledge that AcmeTech HR can choose to do whatever AcmeTech HR wants to do, and that is fine, but if you look at what they were spending each year on headcount, turnover and hire reports for their organization, it was at least $500,000 - and if you calculate that over 5 years, they have spent at least $2.5 million for a very basic reporting framework, without any real semblance of the advanced analytics we know as People Analytics, and full of holes.
This, my friends, is why Excel in HR is dangerous, and a great case study for why you should consider an alternative solution for analysis and reporting! There are a variety of options today, one of which is One Model - which, in full disclosure, I recently joined - I'm a little biased. Other options are out there: see What Your HR Analytics Technology Sales Rep Doesn't Want You to Know. ---------------------------------------------------------------------------------------- Disclosures: This is a "Gloves Off Friday" post. Mike West is a bad man. Mike West writes way too much about People Analytics. Mike West is currently VP of Product Strategy for One Model. ---------------------------------------------------------------------------------------- Who is Mike West? Mike has 15 years of experience building People Analytics from the ground up as an employee at the founding of Merck HR Decision Support, PetSmart Talent Analytics, Google People Analytics, Children's Medical (Dallas) HR Analytics, and PeopleAnalyst - the first People Analytics design firm - working with Jawbone, Otsuka and several People Analytics technology start-ups. Mike is currently the VP of Product Strategy for One Model - the first cloud data warehouse platform designed for People Analytics. Mike's passion is figuring out how to create an analysis strategy for difficult HR problems. Connect with Mike West on LinkedIn. ---------------------------------------------------------------------------------------- I'm putting together a series of live group webinars where I will be revealing a process for dramatically increasing the probability of success of People Analytics - building on a career of successes and failures (Merck, PetSmart, Google, Otsuka, Children's Medical Dallas and Jawbone) - and applying new ideas I have developed over the last few years applying ideas from Lean to People Analytics.
The goal of this webinar series is to engage a select group of qualified early adopters who have access to an organization, are willing to apply the process, report back how things are going, and work out the bugs together. This group will have the opportunity to share their findings with the broader People Analytics and HR community, if they choose.
Featured
8 min read
Mike West
"Life is a struggle, and then you die." So go make something of it. Work on something important, and watch out for these things. The 5 Reasons Most HR Analytics Efforts Stall: #1. Not having enough precision about what the right problem is to focus on, and what questions you need to answer to solve that problem. The typical fail is that you will spend a huge amount of time, money and effort to get an HR reporting environment set up, but downstream users do not use it. People say the information is nice to have - they just don't have time to go look at your reports. Sometimes the problem is that the information has little relationship to important decisions, or little bearing on the work that anyone is doing. Often the people supported will request an infinite assortment of trivial changes in the desperate hope that each change will produce a better result. Or, with no specific reason provided, you and your solution just go from hero to old hat overnight - and you are left to wonder what actually went wrong. The problem is that you spent your resources and time working on the wrong problems and questions. More could have been accomplished with your time and effort had clarity been achieved at the outset. #2. Not having all the right data you need to answer the questions you want to answer. The worst possible outcome of analysis is to produce a statistically significant finding that increases certainty in a wrong answer. This is a common outcome of the "missing variable problem" (the unknown unknowns that wreck most analysis). This is some portion of the 80% of the variance your model did not explain. You ran the analysis but did not include the right control data, so you get an answer, but you get a wrong answer, and you have no way of knowing you got a wrong answer. Sound like a nightmare? This is not a nightmare, this is a real problem.
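The missing variable problem is easy to demonstrate with simulated data. In the sketch below (a toy illustration, with made-up variables: say x is tenure and z is an unmeasured driver like manager quality), the outcome depends on both x and z, but z is correlated with x. A regression that omits z produces a precise, confident, and wrong estimate of x's effect - with nothing in the output to warn you:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=n)                       # observed driver (e.g. tenure)
z = 0.5 * x + rng.normal(size=n)             # hidden driver correlated with x
y = 2.0 * x + 3.0 * z + rng.normal(size=n)   # outcome depends on BOTH

# Regression that omits z: simple slope = cov(x, y) / var(x).
# The hidden variable's effect leaks into x's coefficient.
biased_slope = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# Regression that includes z recovers the true effect of x (2.0)
X = np.column_stack([x, z, np.ones(n)])
full_slope = np.linalg.lstsq(X, y, rcond=None)[0][0]

print(biased_slope)  # close to 3.5 -- confidently wrong
print(full_slope)    # close to 2.0 -- the true effect
```

The biased regression will also report a highly significant p-value, which is exactly the nightmare described above: a statistically significant finding that increases certainty in a wrong answer.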
The second worst possible outcome is when you do all the work and don't achieve a statistically significant finding, but could have produced a finding if you had included the right variables. In either case, not having a basic theory that explains which variables to include in the analysis results in you never achieving a satisfactory outcome from your effort, or you may double or triple the hours to reach a conclusive answer because many passes are required. These problems are why we pay university scientists the big bucks. Big bucks!? O.K., not really! University scientists try harder because they know their work will be peer reviewed by other really smart people who also know something about the topic. We don't have this check inside organizations - we have non-experts reviewing the work of experts. Major danger. #3. Expecting technology to solve the whole problem (absent analysts). You have made an important investment in supporting technology, but you may not get anything of lasting value out of this investment because you did not factor in the cost of acquiring (or creating) skilled operators of this technology. It is as if you have this wonderful piece of machinery sitting idle. The success you have with analytics depends on the experience and preparation of the people working the analysis. You can achieve two different analysis outcomes with two different analysts using the same technology! The worst part - if you get it wrong, how would you even know? Between different analysts you will find different choices about what data to include and which research method to use, as well as differences in skill in executing the chosen method. Clearly, you need to think about analysts, but you must also think about the rest of the organization. It does not help you to have this group of "really smart data people" and nobody else who knows how to use their work.
You need everyone on the same page about where we are trying to go, what everyone's role is, and how it all fits together. #4. Expecting your analyst to solve the whole problem (absent the right technology & support). Analysts are evaluated based on results. Some other HR roles can produce activities (implementing programs, policies, processes and systems) and we declare victory at the conclusion of the activity without respect to impact (which, conveniently, is never measured). Success is defined as completion of the project on budget and on time, then on to the next. Analysts do not have this privilege! For analysts, the proof is in the pudding. If you tell the truth - "based on our data and the tools I have, I found nothing of lasting significance to you" - your reward is that you don't get invited back to the meeting. Analysts either produce very little value and stick around (satisfied with the activity for pay, as opposed to outcomes) or they leave for another opportunity to do better analysis. They either have a fire in their eye or they don't. You want the one who cares, or don't bother even starting. You have made an important investment in a person, but you can get nothing of lasting value out of this person without providing the tools and support they need to complete their work. Managing an environment like this is difficult, but not impossible - it requires skill and care. #5. Expecting results without someone putting in hard work. Your typical project management wisdom applies - you are trading off among time, quality and cost. Every new question you want to answer will involve investment in new data collection, cleanup, transport, joining, reshaping, statistics and figuring out how best to communicate the results. We inevitably want automation of routine analytical workflow, but there are first and second priority constraints: what should be made routine? How can it be made routine?
We (technologists) will try our best to design our way out of this, but the first pass is best handled by a human - this will be hard to get around. It will fail if no one has put in the work. This doesn't necessarily mean you have to do all the work - or even the hard work - just that somebody does, and there is no way to escape this cost. You can bring in consultants to do the work, you can hire enough people in your organization to do the work, or you can buy packaged solutions that help with part of the work. In this you will be making big trade-offs on time, quality of delivery and cost. Beware - no silver bullet will kill this beast. You must make a commitment to ongoing refinement of the analytical process, or you get an analytical process that really does nothing for you. If you get into the real day-to-day work of HR Analytics, the People Analysts are dealing with data that was generated for some other purpose and does not conform to the basic needs and expectations of the current purpose. The best way to understand what must be done to automate an analytics workflow is to have someone work through the analysis one time to understand what data is there, what is not, what is wrong, and figure out how what is there must be improved for successful analysis to occur. Often we implement expensive reporting solutions on the hope that these will produce useful insights. Why invest in automation (repeatability) if you cannot achieve a useful finding through a manual effort? Hope? Hope is a great attitude to apply to all situations, but not a great strategy. Why not run through it manually one time and figure out if it is worth automating? The most important question is - when you get to the end of it all manually, do you end up with a report that is useful to the organization? If you did, great; now is the right time to make a decision about automation.
Featured
15 min read
Mike West
"This the real world, homie, school finished. They done stole your dreams, you dunno who did it." Kanye West (no relation). A New People Analytics Blog Series: Gloves Off Friday! At the moment, in People Analytics I see a lot of the same keywords repeated over and over. Two examples: insights and storytelling... I find myself thinking: It is great you are using the right words. These are what we want out of our analytics systems, and great claims, but what do these words really have to do with your application today? Are these words real or are they hype? The first hard truth is this: most of the time, enterprise analytics system keywords are just marketing hype. Sell the people what they want to be sold. Keep in mind that if an existing system has any real market traction, it was probably designed well before we even started using the keywords we use today. If the product was designed 5-10 years ago, you might ask: so really, what has changed? Are these just old solutions packaged a new way? If these systems have existed all this time, why are they suddenly going to help HR in a different way now when they didn't in the past? My suggestion is this - when you peer under the hood of a reporting system, look at the following: 1.) make sure you are finding what would constitute viable insights, or stories that substantiate the insights and storytelling claim; 2.) make sure you can find examples of insights you could only derive in this system (not something where you could substitute almost any other system of the same family and achieve the same result); 3.) make sure that the way these insights are achieved will work in the real world (that it is based on clear and accurate assumptions, and on a process that is scalable). I might be going out on a limb by myself, and maybe I could get fired for saying this, but I will stand on this view: at the moment, insights and stories are mostly functions of human beings - not systems!
Let me say it again: insights and stories have mostly to do with the observations of the operators of systems - analysts - and little to do with the systems themselves. My position could change someday, but for now I'm not holding my breath. Instead, when you review a potential solution you should ask better questions: How does this application enable my analysts to derive insights in a different & better way than any other application? What do analysts think about this application? The second hard truth is that the problem addressed by the analytics systems of today is the efficiency of the analyst at producing the insight, not the insights themselves. Despite differences in features and presentation, there really is not a lot of meaningful difference between most broad use case enterprise reporting applications in the range of possible insights produced. If you see that the essence of the problem these applications are designed to solve is the efficiency of insight production, you will know how to better prioritize your decision. By nature, the principles that underlie the use of technology are: automation, repeatability & scalability. How does this application support automation, repeatability & scalability of the analytical process? Where does the data come from? How is the data loaded? How do we deal with the unique nuances of our organization and the quirkiness of our data? What do I do when we change our underlying systems? How can this application adjust to changes? What level of expertise is required to do the work? Who will do the work? Pay very careful attention to those features that promote a sustainable path to insights (with less manual work per insight produced) - producing ongoing long-term efficiencies in the analytics process. The devil is in the details. Ask the people who do the work what they think.
This statement is not intended to overemphasize the importance of efficiency over all else - I'm just saying let's be clear about what real problem we are solving for when we implement a technology system; the alternative is a disaster waiting to happen. The system will eventually roll out, and it will eventually be held to the standard of how it was promoted. What I mean is this: if the assumption is that after this is implemented there will be no analysts required, does the system actually produce insights without an analyst? If the major difference in this application is how it can bypass the analyst to get the insight directly to the downstream user, you must observe: does the downstream user take on the work to go in and see those reports? Do the reports viewed by the downstream user translate directly into any useful action? Is the insight better than what could be achieved with the assistance of a specialized analyst? Be honest. If the answer to any of these questions is no, but you sold it as so, you will be tearing that solution out in a few years. Fool me once_________, Fool me twice _________? The third hard truth is that THERE ARE REAL DIFFERENCES in how enterprise analytics systems approach efficiency in the production of insights. The first great simplifier - consider: where is the system maintained and delivered? Is it an "On Premise" or "Cloud" solution? The keywords here are: Cloud, Software as a Service (SaaS). This is the first major branch in your decision tree. Cloud and SaaS are not very human words, and at this point they are already feeling a little tired out, but they are still very important to pay attention to. There are technical nuances to the definition of cloud, but in layman's terms what we are essentially getting at is: are all customers on the same instance over the internet (cloud), or does each customer maintain their own instance on their own servers or desktops (on premise)?
Software as a Service usually goes together with cloud - this refers primarily to the method of payment. With a SaaS solution you are renting the software rather than buying it. My opinion: if you are not moving to the cloud, "You are going the wrong way!..." For fun, here is a great wrong-way clip - https://youtu.be/_akwHYMdbsM - I just love John Candy and Steve Martin together in this movie. For example - there is a reason Google just pulled out their homegrown Human Resource Information System (HRIS) - GoogleHR (GHR) - which giant teams of Google engineers had worked on for over 10 years, replacing it with WorkDay, a cloud, Software as a Service (SaaS) HRIS. The reasons are: A.) Your business is XYZ, not whatever this is. Unless Google planned on getting into the HRIS business, Google had to ask: what the hell are we doing fooling around with HRIS? A very good question. B.) You will spend less on a cloud solution than an on premise solution. The cost of cloud infrastructure is spread among all customers, as opposed to a single entity. Fundamental economics: 9 times out of 10, cloud will be a better value than on premise. Also, because you are billed for your use of the software over time, if somewhere down the road you don't like it you can go with another solution. You just turn it off! This is a much easier pill to swallow than a PeopleSoft or Oracle implementation used to be. C.) Cloud companies are faster innovators. Cloud companies have a single instance to invest continuous innovation in - consequently they push more frequent updates out to all customers on one platform - therefore they are faster innovators. Google is a cloud company too; this fact is too close to home to miss. Technology is evolving so rapidly that if you buy something that sits on premise (or build it), it will be out of date before it is fully implemented.
------------------------------------------------------------------------------------------- By the way, I'm NOT for or against WorkDay specifically. WorkDay is just an example of the overwhelming trend of HR going into the cloud. We were not sure if we wanted our HR data in the cloud at first - now the market is tipping to the cloud dramatically! There are a number of different cloud HRIS product options. Even the old on premise providers (Oracle and SAP) have options that are cloud now. WorkDay is just a working example of an HR application in the cloud that the market has already wrapped its mind around. -------------------------------------------------------------------------------------------- If you are looking at an enterprise analytics solution that is not in the cloud and not delivered as software as a service - and you don't have a really good, unique reason for this - you are probably making a mistake. The next consideration to pay attention to in HOW is: what functional use case or domain was this enterprise analytics system designed for? This is the next big divider - here is where we get into the nuances of important, less obvious choices you will be making. The second great simplifier - is this system designed for generalized purposes or specific ones? This is another major branch in your decision tree. Option A: a generalized analytical system that can theoretically be applied to any analytical problem (but that is not designed specifically for any). Option B: a solution that is designed specifically for a particular domain, customer type, or use case. To use the HRIS example again, back when we were actually debating questions like this you could a.) build your HRIS system (an HR database) on a generalizable database structure - say generic Oracle - or b.)
go with a database designed specifically for HR - PeopleSoft, Oracle HR, SAP HR, Lawson HR, Ultimate Software, WorkDay, etc… It took us time to figure this out, but the market decided that buying a system designed for HR is much better. Do you really want your IT team to be learning, following and servicing obscure changes to payroll, compensation or arcane HR needs that ultimately drive database design? That argument was long ago concluded. Overwhelmingly, with no uncertainty, the big girls and boys do not build and service their own HR databases. It turns out that what you are using the database for impacts important design decisions! It also follows that customizing a generalized solution for an HR purpose is overwhelmingly more expensive over time (because of labor costs and other problems unforeseen at the outset). Option A: Generalized Analytical System. Option A - Generalized - may on the surface look less expensive because it is a single solution that can be applied across many functions; however, you have to factor in the labor costs to bend it to the reality of each business function and their needs, and to maintain that. You won’t want your critical IT and Software Eng. teams working on HR stuff, so don’t do it. HR problems are notoriously needy and difficult. Stay out of it. The big costs in technology are not in software; they are in the design and set-up. For example, I can buy a single license of Tableau for $2000 (+ $100K+ on a Tableau server to distribute that report over the company intranet with security), but to apply Tableau to my HR reporting needs I might actually “spend” $300K on IT labor for the build of my ETL (extract, transform, load) and Data Warehouse path, which must occur prior to delivery of the data into Tableau. After this I will probably spend another $100K in labor to get my Tableau dashboard designed to do what I want it to and to look good. Was this actually a $2000 solution? No. 
Not counting the server (presumably used by other business functions outside HR) the solution actually cost me $402K, and possibly a lot more. I will go through those labor costs several times while iterating towards the right set of metrics and reports for HR on a generalized analytical platform. Who will support me when I need to change the ETL? I have been involved in one way or another with these generalized solutions four times over my career: Cognos at Merck, MicroStrategy at PetSmart, MicroStrategy at Google and Tableau at Jawbone. Google simultaneously experimented with Tableau and QlikView and MicroStrategy. There were people at Google who called MicroStrategy “MicroDisaster” or “MicroSadly”, or (insert your own hilarious Micro expletive). The general consensus was: it may actually be a great solution, we are not sure, but WAY too difficult to implement and WAY too complex for the typical user. Keep in mind, this was GOOGLE! Do you know the kind of people they hire? I’m wondering if anyone can figure out MicroStrategy. Sad indeed. In some regards, Tableau and QlikView are designed to be more accessible, but get into the nuances of Tableau and QlikView and you will see they are going down the same path. Arcane nuances. Sub menu within sub menu. Flip this little switch in submenu 24, under the heading of a new word we invented, and then the report will work right… Seriously, that is your solution? Tableau promotes this as a simple solution, FOR EVERYONE. I'm holding them to it. Do you know what Tableau Professional Support Services costs per hour? $250 per hour. I once spent $5000 in Tableau Professional Services to find out that if I changed the way the data looked to Tableau on the way in then everything would work right; if I didn’t, there was no solution to my problem. Major questions - who is going to design and support the right ETL to accommodate this thing I am buying, and what does the solution really cost? Choose carefully. 
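The arithmetic above can be made explicit with a back-of-the-envelope total-cost-of-ownership sketch. This uses only the rough figures from the example (and, like the article, excludes the shared server cost from the total); all numbers are illustrative placeholders, not vendor quotes.

```python
# Back-of-the-envelope TCO for the "generalized BI tool" path described above.
# Figures are the illustrative ones from the example, not real pricing.

def total_cost(components: dict) -> int:
    """Sum the cost components of an analytics solution (in dollars)."""
    return sum(components.values())

generalized_bi = {
    "tableau_license": 2_000,            # single desktop license
    "etl_and_warehouse_labor": 300_000,  # IT labor for the ETL / warehouse path
    "dashboard_design_labor": 100_000,   # labor to design and polish dashboards
}

print(total_cost(generalized_bi))  # prints 402000 - the "real" cost of the $2,000 tool
```

The point of the sketch: the software license is less than 1% of the total; the labor to bend a generalized tool to HR's data is where the money actually goes.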
I can go down the line but the premise is about the same – you might save costs on software by leveraging the same software across functions of your organization, but you give that back in labor costs bending those applications to your functional needs and use cases. It may not be a silver bullet. It may not be cheaper. It may not be easy. The other non-obvious consideration here is that if you are lucky enough to have a Business Intelligence (BI) team - IT people who specialize in business reporting - HR may be able to get their attention briefly, but my experience has been that 4 out of 4 times they tire very quickly of this BI-ignorant HR person and their silly needs. The BI people go away. They disappear. The result is a solution that never gets where you want it to go and ultimately doesn't work. If you are going to go in house, you need dedicated resources - and dedicated resources cost money. You also have to find some really smart technology people who actually like and want to work on HR problems, rather than trying to invent the next Facebook. Not coincidentally, this is extremely hard - even at Facebook. Especially at Facebook. ------------------------------------------------------------------------------------------ For the record, Tableau could be a great downstream data exploration and data visualization application, IF YOU HAVE a viable data warehouse and ETL solution in place for HR. 
------------------------------------------------------------------------------------------ Option B: Specialized Analytical System For HR Here is a sample of options for varied HR Analytics purposes (alphabetical order): CultureAmp CruncHR Glint HiQ Lab One Model OrcaEyes Sapience Analytics SAP SuccessFactors Workforce Analytics (used to be InfoHRM) Visier ZeroedIn (Folks, feel free to add other HR Analytics applications to the comments of this post and I will edit this list to include those later) Maybe someday I will do a detailed analysis of all these applications – for now, here are the main questions you should ask as you evaluate each option (you and your team need to answer these in a no-spin zone): What is the focus of this application? (consider depth, breadth, etc.) What are the 1-3 main differentiators of this application relative to its peers? What non-transparent assumptions underlie those 1-3 main differentiators? What other data management effort and applications are required to produce the final insights and stories that I am looking for? What is the data management strategy to get the data from your varied HR systems to these environments? (consider this at go live and consider the ongoing refresh) This was a Gloves Off Friday Post from Mike West Disclosures: Mike West is a bad man Mike West writes profusely about People Analytics Mike West now works for One Model
11 min read
Mike West
To whom it may concern, For the last 2 years I have been proud to run my own People Analytics consulting company - PeopleAnalyst, which I like to call the first Independent People Analytics Design Company - but on January 1st I will be joining One Model. These are the reasons: 1. I believe that People Analytics is important to the future of HR, the future of business and humanity – perhaps one of the most important business trends in our lifetime. I recently shared the principles I hold, supporting this thought, here: Future of Human Resources - in 10 Principles. Beyond these principles I frequently try to point out that we had Accounting before we had Finance, we had Sales before we had Marketing, and we had HR (People Operations) before People Analytics. In every historical case, the application of an analytical framework to the pre-existing operational function revolutionized the way business was done, and those who were early to it were able to exploit lucrative informational advantages for a brief period of time before they became ubiquitous. In the face of history these changes did not occur that long ago, but today we think of them as always having been the way they are. People Analytics’ time is now – in the future it will be “required for entry” in the big girls/boys club, but it will not be as much of an advantage as it is now. Pay very careful attention to the investment companies like Google and Walmart have made in People Analytics – 30+ people each and growing, going back several years. These companies are not stupid – this says something - they saw something. Google is “cleaning up” on the application of data to the People Operations/Talent area and in many markets they are a force to be reckoned with - nobody is anywhere near them on what they offer and how they do HR today. They are a steamroller. 2. 
Long ago I decided that my work - the application of data, math and science to HR - is my reason for being and part of the intentional legacy I want to leave on this earth. My commitment to what we now call People Analytics is unchanging - the key for me as I go through life is just figuring out where my efforts will have the most impact. I make moves when I sense the math on this has changed. Merck -> PetSmart -> Google -> (brief cross functional divergence) -> Children’s Medical -> PeopleAnalyst (my consulting company - the first People Analytics design company) -> now One Model… As we move forward, I think my area of greatest contribution will be to embed my unique way of approaching People Analytics into a technology environment, making it more realistic, accessible and affordable to more organizations. It is quite an amazing thing for a guy like me to have access to an engineering team with seed funding – I’m not going to pass on that opportunity. 3. Team - I believe in the magic of teamwork. I saw this video, which reminded me of what can be accomplished with teamwork: http://wapo.st/1U19g0M Gives me chills - the good kind. InfoHRM --> Success Factors --> SAP If you know anything about the history of enterprise reporting solutions for HR, the foundational predecessor to modern HR Analytics, you will find that the engineering team at One Model has a very interesting pedigree. Going back 10 years, the only player in this space was a little company called InfoHRM – they were out on the leading edge of HR reporting, essentially running a “cloud-like” solution before we even knew what the cloud was. InfoHRM was acquired by Success Factors (purportedly to help them crack the HR data reporting challenges they couldn’t seem to solve on their own), and then Success Factors was later gobbled up by SAP. 
I don’t know the whole behind-the-scenes story, but my general impression of what happens to people in these big company acquisitions is that how the product is perceived, the dynamics of working for an organization, as well as where you fit into all that, really change. These guys fell out of that. When you see people who helped build a product category before anybody else was doing anything like it, who now say they are building something better knowing what they know now, you stop and listen - at least I do. TechStars – The Top 1% of Startups Another thing I really like about One Model is that they came out of the TechStars Accelerator program. Accelerators like TechStars are super-selective: less than 1% of applicants get in. You could say they are pickier than schools like Harvard, Stanford and MIT. In addition to direct assistance in getting the business model on the right track, and the well-known “Pitch Day”, TechStars alumni have access to a network of investors and advisors for life. In the Startup world, access to capital matters and access is primarily determined by your network. An element of this may seem like a superficial game of “who you know and who knows you”, but an element of this is the ability to get to people who have been through good and hard times and can help you solve really difficult problems. Austin - SXSW and Food Trucks just about say it all. These guys came to Austin to launch their company a little after the time that I did. I’m an HR guy – I’m all about culture and Austin has the right culture for me. Austin is hip (some call it weird), is second only to Silicon Valley for startup community, in a US state that is friendly to business, has a lower cost of living than either US coast, and is well positioned geographically for US enterprise sales – 4 hours by plane to either coast and within driving distance or a very short plane flight of 3 other major cities (Houston, Dallas and San Antonio). 
The startup community is tight-knit and collaborative, with a lot less of the showmanship and games you see in Silicon Valley – I think a higher percentage of people here take creating a real business more seriously. Beyond these intangibles, when it comes to HR data, keep an eye on Austin; this is where it will be - there is some important stuff going on in this space here right now. Mostly I just love Austin – it is an island of authenticity and creative energy unlike anywhere else. 4. Product Focus – oh where we can go together. One of the big mistakes I see in the field right now is that most people who are thinking at all about the space are thinking too narrowly. They think People Analytics is just one type of question, one type of data, or the application of a certain method of working with data. Let’s say prediction, for example. Examples include: how do you predict hires who will perform well in your environment, or how do you predict which people are most likely to leave in a given time frame. However, some of these strike me as gimmicks - not standing on real, solid data - People Analytics is much more than this. For example - I have personally worked with HR data on decisions involving how organizations select (staffing), onboard, pay (compensation), perk (benefits), the origins of happiness and motivation at work, quality of managers, employee commitment/turnover, performance, diversity, learning/training, time off policy, the relationship of HR outcomes to sales outcomes, etc… Others have worked on topics I haven't - the list continues. On top of the varied subject matter focus, you can focus on how you collect data, the tools to make data flow more efficiently, the methods you can use to analyze it, statistics, how you visualize the data, how you distribute it to other people, etc. Any and all of these are potentially viable areas of business focus that you will see niche products in. 
As methods, machine learning algorithms and prediction are hot right now – all of these are in our future, but we still have a lot of work to do on them. Here are the main perceptions I will offer on product focus at this time: People Analytics is eclectic, expansive and inclusive. In its essence, People Analytics is the systematic application of behavioral science and statistics to Human Resource Management to achieve probability-derived business advantages. We need solutions that enable analysts to be better analysts in the world of possibilities, not try to replace the analysts entirely. We need solutions that create more heroes, not fewer. People tire quickly of gadgets and nobody wants to purchase and manage an ever-expanding assortment of gadgets (or only if they are all made by Apple). I’ll put it another way - One Model looks more like an aircraft carrier to me than a paper airplane. Organizations operate as holistic systems; therefore the answers to problems span across areas of specific functional responsibility, expertise and operational data stores. We have a lot of silos of data in HR – HR has undergone progressive advances through technology specialization and will continue to. The great irony is that the future of HR Analytics may be just the reverse: synthesis, not specialization. The differentiating premise of One Model is synthesis. Many advantages will stem from this vantage point. If you care about synthesis of data in HR, One Model should be on your short list. To do any analytics, simple or advanced (prediction, forecasting, optimization), accurately, regularly, and in a timely and efficient way requires addressing sprawling, un-integrated operational HR data sources and processes. One Model decided to start with the ‘data munging’ fundamentals and build from there. 
That doesn’t sound sexy, and initially it is a little more difficult to get the same kind of PR as a result, but it is important. They will steadily deliver increasing value on that foundation, which offers a lot of possibilities and takes customers into the future in steps, not all at once. Imagine showing up to the board room with predictions about employees when you can't accurately or quickly answer any of the basic employee ins-and-outs questions. Begging the question - do you really know your workforce? What exactly do you do here anyway? I'm all for prediction, and One Model is going there, but don't over promise: really get to know HR data specifically, get the 'data munging' fundamentals right, organize more sources of data more efficiently than anyone else, and delight and surprise the customer progressively as you go. I agree with this – I think it works better, fits the needs of today's HR function, and matches my practical Midwest (US) values. Cloud / Software as a Service is here, and is the future. The premise of One Model is that they can invest big in infrastructure and innovation on that infrastructure and distribute those gains to everyone. It only gets progressively better and more efficient over time. Why should every company invest in homegrown infrastructure for HR Reporting and Analytics independently? To reinvent the entire HR Analytics workflow internally at every organization is unrealistic for most organizations, and a ludicrous business proposition. We no longer design our own homegrown HRIS systems today - why create and maintain our own technology infrastructure for HR Reporting and Analytics? I think 5-10 years from now we will look back and wonder why we used to do this at all, evoking the puckered sneers that “legacy HRIS solutions” get today. 
Don’t get me wrong - you should invest in ultra-advanced or niche innovations in analytics unique to your business, in your environment; however, in order to have the time and resources available to apply that kind of focus, you can rent everything else. How about getting on a platform that can speak to those applications and everything else? Like me, these guys believe in "play nice with others" and that good guys do win too sometimes. You want to come along for the ride? ---------------------------------------------------------------------------------------- Who is Mike West? Mike has 10+ years of experience building People Analytics from the ground up at companies such as Google, Merck, PetSmart, Children's Medical, Jawbone and other places. Mike's passion is to develop thought leadership and to cross-pollinate the frameworks and processes he helped develop and pioneer as an employee at these places. Mike spends most of his time teaching, coaching and writing on all things People Analytics.
21 min read
Mike West
“Men* wanted for hazardous journey. Low wages, bitter cold, long hours of complete darkness. Safe return doubtful. Honour and recognition in event of success.” Serious question. If you stumbled on the employment ad above, would you respond? The ad is purported to have been posted by Ernest Shackleton to recruit people (“Men*”) for his Endurance expedition to the South Pole. There is some debate whether the ad was actually written by Shackleton; nevertheless, whoever wrote it could get credit for compelling ad copy, as well as possibly the first example of a realistic job preview. *This was around 1914, and apparently whoever actually wrote this was unaware of women’s interest in suffering in the workplace for 25% less than men! This reminds me of a pernicious glass ceiling and wage disparity problem – and now suddenly taking a little boat down to the South Pole doesn’t actually sound so challenging to me. Lots more I can say about the journey into problems of diversity and gender, but we will circle back to HR data and diversity another time. For now, let’s just continue with our juxtaposition of the cold, icy cold, Antarctic and HR Analytics, in general. A Brief Summary of The Expedition (Wikipedia): Endurance became beset in the ice of the Weddell Sea before reaching Vahsel Bay, and despite efforts to free it, drifted northward, held in the pack ice, throughout the Antarctic winter of 1915. Eventually the ship was crushed and sank, stranding its 28-man complement on the ice. After months spent in makeshift camps as the ice continued its northwards drift, the party took to the lifeboats to reach the inhospitable, uninhabited Elephant Island. Shackleton and five others then made an 800-mile (1,287 km) open-boat journey in the James Caird to reach South Georgia. From there, Shackleton was eventually able to mount a rescue of the men waiting on Elephant Island and bring them home. 
On the other side of the continent, the Ross Sea party overcame great hardships to fulfill its sub-mission. Aurora was blown from her moorings during a gale and was unable to return, leaving the shore party marooned without proper supplies or equipment. Nevertheless, the depots were laid, but three lives were lost in the process. Clearly the expedition failed to accomplish all of its objectives – yet it is recognized as an epic feat of leadership and endurance, and one of the last of the “great expeditions”. Some time ago I came across the ad, which tickles my sick sense of humor and my imagination. Who would take this job? I wonder - who wouldn’t take this job? In my mind I notice subtle similarities to the choices I have made in my career – which circles around the question of how to change Human Resources, if not the direction of work as a whole, with data. Despite the difficulty and a seemingly complete lack of possibility for fame in this work, I am excited by what I do and I love every minute of it. I probably would have responded to the ad. I am also intrigued by others who take an extended interest in my strange field - partly because historically extended interest has been rare and partly because it is far from what most would consider a "rewarding career". I have been tracking new roles in HR Analytics, Talent Analytics, Workforce Analytics (what I call People Analytics) through job postings off and on for a long time. More recently, I have been searching for these people on LinkedIn. It is a question that begs to be answered – who are these people, what is their story, what do they care about, what do they want to achieve, how are they received within their HR organizations, what are they working on, what are their struggles? Those questions are a work in progress. For now, the fresh faces and backgrounds I see in these roles, in pictures and LinkedIn headlines, are already truly inspiring to me. 
They surpass my imagination in magnitude and breadth and to me represent "pure energy". I think the people who are drawn to these roles are special and amazing. I have no doubt that as a collective they/we will ultimately work through the difficulties to reach our destination (Destination? This too is a great question). “Honour and recognition” hopefully forthcoming. In real life, who would respond to an ad like Shackleton’s? I think a high proportion of those people today would be these People Analysts. The Vast Expanse of Human Resources The real HR is vastly underestimated and mostly lost on anyone without direct exposure to a leadership role in HR in a large modern organization. By large I don't mean 500, I mean 50,000. HR doesn't really come into the spotlight until you reach 5,000 employees. At this point, you start to realize the inherent value in getting HR right - that is, if it is not too late for you! If you reach 50,000 employees you have probably figured HR out and now you preach it; nevertheless, you are in a world of much bigger HR issues. If you are in a leadership position in HR for one of these companies, few people know your precise struggles. It is a lonely world. If you could peer inside the little windows you would see that HR in large modern organizations is a complex network of technology, policy, process and people within discrete areas of specialization. HR can be grouped broadly into categories of Staffing (Sourcing, Recruiting, Onboarding), Total Rewards (Compensation & Benefits), Diversity, Talent Management and Organization Effectiveness (Performance Management, Succession Planning, Organization Design), Training and Development, Employee and Labor Relations, and HR Law & Govt. Compliance. It is a veritable alphabet soup of things to learn, with an entire language of its own. Sounds crazy, but yes, it is a diverse world of very real and very different things, all in this thing we call "HR". 
If you want to dive deeper you can find more granular silos of responsibility, but I will refrain from the arcane details and just leave it at the high level. I always forget - how many ways can Eskimos describe snow, and why? I don't know the answer; however, I can surmise that they must spend a lot of time with the stuff. This is a good place to pause – are you still with me? Here is a more down-to-earth human story for you. Several times I have spoken to recruiters (an actual real-life role in HR), and as I attempted to describe the complexity of tying out data from many HR sources to some meaningful business conclusion, and what that actually requires, I have been interrupted with a reply something like: “It is nice you have had exposure to all those other things, but to be clear, in this role you will deal exclusively with HR data. Are you o.k. with that?” With HR data? After only just having described a variety of HR data sources, not really even making the point about how these sources should be tied to business data, I am perplexed at who and what I am dealing with here. I am left to assume that by HR, they mean Staffing, a subset of HR, and the rest of it is lost on them. Clearly, per their guidance, it will also be lost from me and the organization if I take this job under these conditions. Do I correct them, or proceed? I am sorry to be blazingly disrespectful, but I have been tempted to stop right there and say, “Hello, Hello, are you still with me?" Should I proceed ahead or turn back? O.k., for now I will spare everyone the graduate-level course in Human Resources, but if you would like one they offer specialized programs in this stuff at the University of Minnesota, Cornell, Rutgers, University of Illinois, USC, somewhere in Michigan, and a few other schools! Some of the world's most respected CHROs of the largest organizations in the world came from those programs - I’d be happy to make introductions. By now you are rolling your eyes at me. 
No problem, I’m used to this. I am not sure what other people were doing when Shackleton was hacking away at the ice that was destroying the hull of his ship - I am sure his choices in life made a few eyes roll as well. Leadership and decision-making for HR sub-functions is distributed among members of the HR Leadership Team (HRLT). In large organizations the HR Leadership Team generally consists of 7-10 Sr. Directors reporting to a “Chief Human Resources Officer” (CHRO, SVP of HR, etc.). The Chief Human Resources Officer may have leadership, decision-making authority and budgetary control over the HR function, but in some cases the big budgetary decisions may go all the way up to the CEO and/or the board. Staff/Employees/Humans typically represent a spend of 50-85% of most modern organizations' revenue. Usually over 75%. Stop - check it out in the annual reports, these days available on the internet. These are real numbers. Where is the profit? It is the little bit that is left over after all those people are done with their work and paid. The shareholders are buying a share in that little bitty slice. "Our most important asset". Have we looked at how we do HR with data? Not much, not really - we know it is somewhere down there - it's a line item. In most advanced organizations the head of HR will report to the CEO, but often they report to the head of Operations or Finance – this varies by company. It is usually a signpost – a little piece of ice floating – be sure to keep an eye on it, something big may actually be under this ice. Regardless, wherever they report, it is contingent upon the CHRO to drive alignment of HR goals with “the business” and alignment between HR sub-functions on a central strategy and operational architecture for Human Resources. Seen as a “support function”, annual HR goals and budgets typically FOLLOW development of business strategy and goals. 
HR goals are assembled in a last-minute, hurried manner sometime AFTER all the other business functions have had their shot at the podium. If those other strategies and goals are not yet fully clear, or the HR/People consequences cannot yet be ascertained with the information on hand, or the HR leadership team fails to communicate the people implications of the business goals, then HR will follow the business in a misaligned and disheveled path. In my experience, these are accurate descriptions of actual HR goals. Whether or not the HR team is able to formulate a coherent plan, HR has the last shot at organizational resources - consequently HR efforts are often unclear, underfunded and unrealistic (relative to the magnitude and difficulty of the goals). Examples of strategic HR objectives include: “Improve Employee Morale”, “Improve Employee Engagement”, “Reduce Employee Turnover”, “Change our Company Culture”... Hey, a big challenge excites me, but FYI these population-level averages are resistant to dramatic changes. I’m not saying they don’t matter or can’t ever change – the polar icecaps are melting as we speak - I am saying that moving these things in a different direction than they want to go requires serious attention to detail, resources, teamwork and commitment. It starts with data – let’s try to have a look at it together. Before Data, HR Systems Historically, HR has vacillated between a single-system ERP (Enterprise Resource Planning System) or HRIS (Human Resource Information System) that is average at everything, and a series of “best of breed” applications that are better at single-purpose functions (Applicant Tracking Systems, Performance Management Systems, Compensation Systems, Learning Management Systems, Time and Labor Systems, etc…). Since CHROs and/or critical stakeholders in HR turn over every 2-4 years, you can guarantee an exchange of favored technology to solve whatever shortfalls exist. 
Meanwhile, within a few years of implementing a solution we have new leadership who prefer a newer, more optimistic path. Never wanting to get trapped in the ice, there is a constant pull towards fragmented systems stemming from HR sub-function implementation of “best of breed” applications that align with HR sub-function operational objectives. If I am the head of Staffing and my team is tasked with improving Staffing, why would I settle for anything less than the best Applicant Tracking System? If I am the head of Compensation and my team is tasked with getting a grip on Total Rewards, why would I settle for anything less than the best Compensation Management System? So on and so forth. There is nothing wrong with this; however, it results in a lack of technology optimization across sub-function silos and some important data consequences that must be addressed before or during reporting and analytics. Larger, established HRIS systems (PeopleSoft, Oracle, SAP, Lawson, and recently Workday) have more dependencies to worry about and thus are inherently slower to innovate than the collective of single-purpose systems. The core systems will always lag in one or more sub-function operational areas. For example, even 10-15 years ago you could facilitate Performance Management on SAP or Oracle HRIS; however, Success Factors came along and offered a solution that was designed for this and was consequently much better at it. So for 5-10 years Success Factors took the HR market for Performance Management, creating a hundred-million-dollar market and adding yet another system category for HR to manage and integrate. Success Factors was later acquired by SAP (for $3.4 billion), and to this day it can still be purchased by your organization as a stand-alone Performance Management application, with or without SAP's HRIS product. To this day most of the organizations I have worked with or for have NOT fully integrated Performance reporting between Success Factors and their HRIS. 
They prepare for the process to begin by manually setting up and loading a file. There is someone deep within the bowels of the organization doing several spins on this - probably with a smartphone going off wildly in the middle of dinner with the family, if this person even gets dinner with the family for a month or two of their life. Getting the data out and rejoined to a changing organization for broader analytical purposes is another thing entirely. You will then discover that you can't export all the data you want from the same report, and there may not even be a common key! You might ask, "You mean to tell me we have no efficient method to join and report this data that is central to HR and to the business? Are you kidding me?" No. We have seen the same in Staffing, Learning Management, and Compensation. Compensation is especially bad - I like to call Compensation Planning (an annual event at every large company) a "planned emergency". 99.9% of the problem is embedded in systems; 100% of it is caused by choices made by humans. By virtue of the many simultaneously moving fronts of sub-function innovation, HR will undergo constant fragmentation and change in systems. Innovation in HR systems is good for us, but it is also very disruptive. Let's Talk About HR Data Contrary to widespread belief, HR actually has a lot of data, and a lot of good data. It is just locked away in systems not designed for reporting or analysis. Universally, HR systems are designed to facilitate operational processes, and while they can perform their intended operational functions well (by design), they often do so at the expense of reporting and analysis. I will take a bucket of ice water over the head if someone shows me a single HR system that can perform a chi-square test or binary logistic regression out of the box. How about any statistics beyond addition, subtraction, multiplication and division? We are waiting. Still waiting.
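For what it's worth, that chi-square test is not exotic - once the data is out of the system, it is a few lines of code. Here is a minimal sketch in plain Python; the exit counts are invented for illustration, not taken from any real system:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table:

                stayed  exited
        group1    a       b
        group2    c       d

    Uses the standard shortcut formula n*(ad - bc)^2 / (row and
    column margin product), equivalent to summing (O - E)^2 / E."""
    n = a + b + c + d
    margins = (a + b) * (c + d) * (a + c) * (b + d)
    return n * (a * d - b * c) ** 2 / margins

# Illustrative counts: exits by gender over one year.
# Group 1 lost 30 of 100 people; group 2 lost 20 of 100.
stat = chi_square_2x2(70, 30, 80, 20)
# With 1 degree of freedom, a statistic above ~3.84 would be
# significant at p < 0.05; here the difference is not.
```

The point is not that this particular test is hard to write - it is that the transactional systems holding the data give you no way to run it in place.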
Frankly, I like to keep my statistics and data-visualization software options open - I don't want my HR system to do everything for me, but I do need to be able to get the data out of my HR systems. Often even this seemingly basic task (get the data out) is difficult. Apparently, nobody thought about getting the data out. If you are laughing, stop laughing. I'm not kidding. We can't get deep insights or a complete picture of the story until we get the data out of multiple systems, join it, and have a look at it through applications designed for reporting and analysis. Most HR "Analysts" are cobbling data together inefficiently with manual, undeveloped or broken data processes. Some 80% of HR Analyst effort goes to manually compensating for nonexistent or inefficient data workflows. They spend very little time on actual science, statistical analysis and presentation of analysis. I have been an HR Analyst in one form or another for over 15 years, I speak to HR Analysts weekly, and I have seen the surveys. HR data are/is complex because people and organizations are complex and the sub-functions that support HR (as described at a high level above) are varied. There is hardly any similarity in reporting between Compensation and Benefits, let alone Staffing, Training or Employee Relations. Who owns Turnover reporting? Who owns the Employee Survey? These come from completely different systems, with different data structures and different metrics. The desired metrics can escalate into the hundreds, with variations in the thousands. Most HR metrics are compound, with dozens of potentially relevant dimensions to monitor. Let's take a look at a very commonly produced and seemingly simple HR metric: "Employee Attrition/Turnover". This metric is formulated as a compound calculation (Time Period Exits / Time Period Average Headcount). Now before you shout "Eureka!
- I have it - you just divide this number by this other number", keep in mind that you will need to calculate this by segments along multiple dimensions: location, division, business unit, tenure, grade/pay group, performance group, age, ethnicity, gender, etc. You may have 50 locations. Some locations have 5 people and some have 15,000 people. What if you want division by location by gender? Also keep in mind that in annual form the denominator (average headcount) requires 12-13 or more data points for each subset of each dimension, and these subset counts must all align in definition and time period consistently with the numerator. Also keep in mind that organizations can be expressed in different ways (people reporting relationships or cost center reporting relationships, which do not typically match), and that the only constant is that organizations are constantly changing. Imagine a data set that changes simultaneously along multiple dimensions over time while you are trying to report on it consistently over time to demonstrate a "trend" or "story" - how exactly does this work? If you are flabbergasted, don't worry; most of us People Analysts can work this problem in our sleep. This isn't even the hard stuff. I just described the calculation of one metric; now multiply this effort by 20 more metrics with data sitting across different systems and try to derive some meaning from that. Shouldn't we be analyzing this stuff together to tell a story, not independently? Most of the people doing good work on employee turnover have long ago moved on to more advanced ways of looking at exits (logistic regression, survival analysis, hazard curves, predictive models, etc.) and incorporate data from many more sources. I can ask employees three questions and, based on their answers, tell you whether their chance of leaving in the next year is about average, 2x average or 3x average, without knowing anything else about them.
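To make that compound calculation concrete, here is a minimal sketch in plain Python of annual turnover segmented along a pair of dimensions. The segment keys, headcounts and exit counts are invented for illustration; a real implementation also has to resolve the definitional alignment issues described above:

```python
def annual_turnover(exits, monthly_headcounts):
    """Annual turnover per segment = period exits / average headcount.

    `exits` maps a segment key (e.g. (location, gender)) to the count
    of exits in the year; `monthly_headcounts` maps the same keys to
    12-13 point-in-time headcount snapshots. Both must be built on the
    same population definition and time period for the ratio to mean
    anything."""
    rates = {}
    for segment, counts in monthly_headcounts.items():
        avg_hc = sum(counts) / len(counts)
        n_exits = exits.get(segment, 0)
        rates[segment] = n_exits / avg_hc if avg_hc else None
    return rates

# Two illustrative segments of one location, split by gender
exits = {("Austin", "F"): 3, ("Austin", "M"): 5}
headcounts = {
    ("Austin", "F"): [20] * 6 + [25] * 6,  # headcount grew mid-year
    ("Austin", "M"): [40] * 12,
}
rates = annual_turnover(exits, headcounts)
```

Note that every additional dimension multiplies the number of segment keys, and every segment needs its own aligned numerator and denominator - which is exactly why "just divide two numbers" is not the whole job.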
Give me their job and salary and tenure and a few other details and we can clean up on this problem. Let me join it to performance and compensation and now you can decide carefully how you want to distribute the limited budget you have to work with. Why not? If you are not working on this, what exactly are you working on? Speaking of cleaning up: for most organizations, you will get started on a seemingly simple problem and bang into an inconsistency or perceived data quality problem, which requires people sitting in another sub-function you have no authority over to make a decision, fix a process, or change how they do something. This is not simple any longer - this is really difficult. You can circle the same issues for years. Welcome to People Analytics. Are You Sure You Really Want This Job? Then There Were People HR professionals are not selected for a background in technology, math or science, and so they depend on people outside of HR for support in these capacities. Since HR is seen by "the business" as lower in priority than other business functions (software engineering, finance, marketing, sales, etc.), and therefore not a very prestigious appointment for anyone, IT and data science support suffers. The head of HR is on the phone about something they want, and so is the head of Sales - which call should I take? If HR is already underfunded for its goals, HR IT is even worse off, receiving neither an adequate talent pool nor adequate funding. Budgets are divided among the heads of functional HR silos, so no unifying technology solution can be reached. There are simply too many different jobs to be done in too many different systems. Data architecture for this is an afterthought. Each Sr. Director will compete for the time, attention and resources of whatever HRIS or HR Analytics professionals exist to try to achieve their objectives first (at any and all cost to others - their performance rating is on the line!).
If they cannot get the attention they want from IT or from their HR Analysts, they will try to go outside the organization for the support. Talk about being crushed in the ice. You will be loved and despised, sometimes consecutively, sometimes simultaneously. How this can happen is one of the great mysteries of the universe - right up there with light being describable as both a particle and a wave. It is sort of like the possibility of dying of thirst while standing on water (ice). Add fire and maybe you drown. Now I am being dramatic. This is just to say that I think Shackleton and company did a pretty good job, given the odds. I have had the opportunity to be in these data-oriented HR roles for a variety of very well respected organizations (Merck, PetSmart, Google, and one of the best Children's Hospitals in the country, to name a few) - believe me when I say these HRIS and HR Analytics folks have many more "bosses", a combined list of objectives much longer and technically more complicated, and fewer resources than anyone else. There is high turnover among HRIS and HR Analytics professionals for a reason. You got one. Great. I hope you have a backup plan! Other People Analytics posts by Mike West ---------------------------------------------------------------------------------------- Who is Mike West? Mike has 10+ years of experience building People Analytics from the ground up at companies such as Google, Merck, PetSmart, Children's Medical, Jawbone and other places. Mike's passion is to develop thought leadership in HR and to cross-pollinate the frameworks and processes he helped develop and pioneer as an employee at these places. Mike spends most of his time teaching, coaching and writing on all things People Analytics.
Featured
2 min read
Chris Butler
The One Model team has a huge amount of experience in the HR data and analytics field. Our careers started at Infohrm, the world's first SaaS workforce analytics provider. Infohrm was acquired by SuccessFactors in 2010, and we later moved into SAP with their acquisition of SuccessFactors in 2012. As a result, we've worked with more customers across more data sources than just about anyone else in the world - customers with 200 employees right through to 600,000-employee behemoths. This experience has earned us a unique perspective on how organizations currently use their people data, how they could be using their data in a perfect world, and the supporting technology that is available to them. We've learned that data, and the correct management of it, is the real key to organizations becoming successful with their talent analytics programs. Every company I have ever met struggles with their HR data. Without a properly constructed and maintained method for bringing together all of your HR technology data, visualization tools are a red herring: they will give you some early wins, but you'll soon outgrow their capability with nowhere else to go. Analytics, planning, and even application integration should flow as a natural byproduct of a well-executed data strategy. This is what we bring to our customers with One Model: all of your HR technology data brought together in a single unified source, automatically organized into expert-built data models ready for intelligence and ready to support any other use case. With all of your data together, regardless of the source, the opportunities for using this data, for choosing better business software, and for interaction between data sets become limitless. Our passion is for this data set and the HR challenges we can solve with it.
We have always wanted to be able to build, without restriction, the tools to collect data and to build the calculations, algorithms, and thought leadership initiatives we know our customers want. One Model is architected exactly for that: highly automated, flexible, intuitive, and open to any other toolset you may have already invested in or want to invest in. Easily use Tableau, QlikView, Excel, and SuccessFactors Workforce Analytics. See how we compare to the competition. We are looking for more great customers to come on board and help us refine our roadmap and prioritize the capabilities important to you. Whether your sources are on premise or in the cloud, we're ready to onboard your data and give you complete control - please contact me if you would like to join our customer engagement program.
Featured
6 min read
Stacia Damron
It sounds ridiculous, but it's true. According to the New York Times, women held just 4.2% of CEO roles in America's 500 largest companies. Out of those same 500 companies, 4.5% of the CEOs were named David.* While shocking, unfortunately, it's not incredibly surprising - especially when a whopping 41% of companies say they're "too busy" to deploy diversity initiatives. But for every company out there that's "too busy", there are plenty of others fighting to get it right. Take Google, for example. In 2016, Google's tech staff (specifically tech roles - not company-wide roles) was 1% Black, 2% Hispanic, and 17% women. They announced a plan to invest $150 million in workforce initiatives. The tech staff is now 2.5% Black, 3.5% Hispanic/Latinx, and 24.5% female, according to their 2018 diversity report. So what does that mean? It means that even the brightest and most innovative companies have their work cut out for them when it comes to improving diversity. Change doesn't happen overnight. Diversity breeds innovation; a diverse talent pool leads to diverse ideas. Get this: a Forbes article touts that transitioning a single-gender office to a team equally comprised of men and women could translate to 41% additional revenue. "Metrics" (which is just a fancy word for data, btw) don't lie. It's important to set, track, and monitor workforce diversity goals - especially when we have more tools than ever at our disposal to do so. Over the past few years, here at One Model, we've seen a huge push for placing a priority on monitoring diversity metrics. In 2016, a Fortune 100 financial services organization, Company X (name anonymized), selected One Model's platform to measure and monitor company-wide trends in diversity data and metrics.
As their people analytics and workforce planning solution, One Model allowed them not only to better report on their data, but also to more easily track and monitor changes, determine key KPIs, and see how the improvements they were making internally affected the data. More Accurate Data = Better Reporting. During Company X's transition from SAP to Workday, they used One Model to retrieve and migrate survey data. The platform allowed them to combine and normalize the data from several sources, enabling the team to report off of it as one source. The successful migration provided the HR team with the recovered data and prevented the team from having to redeploy the survey, allowing them to more accurately reflect their current diversity metrics and progress towards goals. This was a win. Here's the challenge: when pulled together, the data indicated that out of several thousand employee responses, a number of employees failed to select or identify with one of the given race selections. This represented a sizeable portion of the employees. One Model's software helped them identify this number. Once they realized this, they saw an opportunity to set up other processes internally. They did just that, which helped identify 95% of the employees who fell within that group, obtaining vital missing data that raised the reported percentage of diversity within the organization. Determining Key KPIs and Measuring Improvements Furthermore, Company X used the One Model platform to identify and reward the departments that successfully hit their recruitment-based diversity goals. This allowed the team to survey these departments and identify the hiring trends and best practices that led to these improved diversity metrics.
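The combine-and-normalize step is conceptually simple once each source's codes are mapped to a shared taxonomy, and it is also where missing responses surface. Here is a minimal sketch in Python; the source names, code mappings, and field names are invented for illustration and are not Company X's actual data:

```python
# Hypothetical code mappings from two source systems to one shared taxonomy
SOURCE_A_MAP = {"01": "White", "02": "Black", "03": "Hispanic/Latinx"}
SOURCE_B_MAP = {
    "White (United States of America)": "White",
    "Black or African American (United States of America)": "Black",
}

def combine_sources(rows_a, rows_b):
    """Union two sources into one reporting set.

    Rows are (employee_id, raw_code) pairs. Unmapped or empty codes
    become None, so missing responses can be counted and followed up."""
    combined = [
        {"id": emp, "ethnicity": SOURCE_A_MAP.get(code)} for emp, code in rows_a
    ] + [
        {"id": emp, "ethnicity": SOURCE_B_MAP.get(code)} for emp, code in rows_b
    ]
    missing = [r["id"] for r in combined if r["ethnicity"] is None]
    return combined, missing

combined, missing = combine_sources(
    [("e1", "01"), ("e2", ""), ("e3", "03")],  # "e2" never answered
    [("e4", "Black or African American (United States of America)")],
)
```

The `missing` list is the interesting output: it is the population a follow-up process (like the one Company X set up) would target.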
By identifying specific processes and KPIs surrounding these diversity metrics, departments that successfully met their goals could share recruiting tactics and best practices to ensure appropriate actions were taken to maximize diversity throughout the whole of the recruiting pipeline. Company X is currently implementing these processes and working towards replicating a similar outcome in other departments in need of workforce diversity improvement. Tracking and Monitoring Changes Last but not least, Company X wanted more visibility into why females had a lesser presence in managerial roles within the organization. While male-to-female promotions were equal (this past year, 32 people were promoted, and 55% of promotions (16 people) went to women), there were significantly more males than females in managerial roles. Upon reviewing the data, they learned that out of the company's requisitions, female applicants only made it to certain stages within the interview process (namely, an in-person interview) 50% of the time. Half the time, the only applicants that made it to a particular stage were male. They formed a hypothesis around a particular KPI: if more females made it to this particular stage, the odds were higher that more females would fill these roles. Company X set a goal that a female candidate should make it to the manager interview stage 80% of the time. They are testing different methods for how best to achieve this, and with One Model's help, they are able to measure the effectiveness of those methods. By providing this visibility, One Model's platform is currently helping them monitor their progress towards this goal, and it allows them to see the effect - the direct impact on the numbers of male and female managers - in real time. Company X is one of the many companies that has realized and embraced the importance of diversity in workforce planning.
We're confident they'll eventually hit their goals, and we're proud to be a part of the solution helping them do so. Is your company ramping up its People Analytics program or diving into workforce diversity initiatives? One Model can help you better view and report on the data associated with your diversity goals. Here are just a few of the top metrics companies are currently focusing on: Recruitment Metrics; Representation Metrics (such as minorities/URMs, veterans, women, and IWDs); Staffing/Placement Metrics; Transaction Metrics; Training Metrics (such as penetration of diversity-related training, general training participation rates, and demographics of the talent pipeline); Advancement Metrics; External Diversity Metrics; and Culture/Workplace Climate Metrics. *Based on 2016 NYT data. Want to see what One Model can do for you? Schedule some time to chat with a One Model team member. About One Model: One Model provides a data management platform and comprehensive suite of people analytics directly from various HR technology platforms to measure all aspects of the employee lifecycle. Use our out-of-the-box integrations, metrics, analytics, and dashboards, or create your own as you need to. We provide a full platform for delivering more information, measurement, and accountability from your team.