If people analytics teams are going to control their own destiny, they're going to need to support the enterprise data strategy. You see, the enterprise data landscape is changing, and IT has heard its internal customers: you want to use your own tools and your own people, and to apply your hard-won domain knowledge in the way you know is effective. Where IT used to fight against resources moving out of its direct control, it has come to understand this is a battle not worth fighting; by enabling subject matter experts to do their thing, IT lets business units be effective and productive.
The movement of recent years is for IT to build an enterprise data mesh into its architecture, in which domain expert teams can build, consume, and drive analysis of data in their own function — so long as they adhere to some shared standards and share their data across the enterprise. For a primer on this trend and the subject, take a read of the article "Data Mesh: Rethinking Enterprise Data Architecture".
The diagram heading this blog shows a simplified view of a data mesh; we'll focus on the people analytics team's role in this framework.
A data mesh is a shared interconnection of data sets accessible by different domain expert teams. Each domain team manages its own data, applying its specific knowledge to the data's construction so it is ready for analytics, insight, and sharing across the business. When data is built to a set of shared principles and standards across the business, it becomes possible for any team to reach across to another domain and incorporate that data set into its own analysis and content. Take, for example, a people analytics team looking to analyze relationships between customer feedback and front-line employees' attributes and experience. Alternatively, a sales analytics team may be looking at the connection between learning and development courses and account executive performance, reaching across into the people analytics domain data set.
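As a minimal sketch of that cross-domain reach, consider two hypothetical data sets — one published by the people domain, one by the customer-experience domain — that share a common key built to the same standard. All table and column names here are illustrative, not from any real system:

```python
from collections import defaultdict

# Hypothetical records published by the people analytics domain
employees = {
    101: {"tenure_years": 0.5, "role": "Support"},
    102: {"tenure_years": 3.2, "role": "Support"},
    103: {"tenure_years": 7.1, "role": "Sales"},
}

# Hypothetical records published by the customer-experience domain
feedback = [
    {"employee_id": 101, "csat_score": 3},
    {"employee_id": 102, "csat_score": 5},
    {"employee_id": 102, "csat_score": 4},
    {"employee_id": 103, "csat_score": 5},
]

# Because both domains publish to a shared standard (a common
# employee_id key), either team can join across the mesh.
scores_by_role = defaultdict(list)
for row in feedback:
    role = employees[row["employee_id"]]["role"]
    scores_by_role[role].append(row["csat_score"])

avg_csat_by_role = {r: sum(s) / len(s) for r, s in scores_by_role.items()}
print(avg_csat_by_role)  # {'Support': 4.0, 'Sales': 5.0}
```

The point isn't the code itself — it's that the join is only possible because both teams publish their data to shared keys and standards.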
Data sharing becomes key in the data mesh architecture; it's why you've seen companies like Snowflake do so well, and incumbents like AWS bring new features to market for cross-cluster data sharing.
One Model was strategically built to support your HR data architecture. If you'd love to learn more, check out our people analytics enterprise products and our data mesh product.
The trend toward the mesh is growing, and you're going to receive support to build your people analytics practice in your own way. If you're still building the case for your own managed infrastructure, use these points to help others see the light and understand how you will support their needs.
I'm sure you've butted heads with this already, but identify whether the organization is supportive of a mesh architecture; if not, gear up to show your internal teams how you will give them what they need while taking some problems off their hands. If they're running centralized, or already operating a well-defined mesh, you will have different conversations to obtain or improve your autonomy.
People analytics teams are going to be asked to contribute to the enterprise data strategy, if they are not already. There are a number of key elements you'll need in order to do this.
This trend toward the data mesh is ongoing; we've watched it for a number of years and heard how IT thinks about solving the HR data problem. The people analytics function is the domain expert team for HR: our job is to deliver insight to the organization, but we are also the stewards of people data across our legacy, current, and future systems. To do our jobs properly, we need to take a bigger-picture view of how we manage this data for the greater good of the organization. In most cases, IT is happy to hand the problem off, whether to an internal team specialized in the domain or to an external vendor who can facilitate it.
It won't surprise you to hear that we know a lot about this subject, because this is what we do. Our core purpose has been understanding and orchestrating people data across the HR tech landscape and beyond. We built for a maturing customer that needed greater access to their data, the capability to use their own tools, and the ability to feed their clean data to other destinations, such as the enterprise data infrastructure and external vendors. Below I cover a few of the ways we achieve this; you can also watch the video at the end of the article.
Off-the-shelf integration products and the front-end tools in most HRIS systems don't cater for the data nuances, scale of extraction, or maintenance activities of the source system. Workday, for example, provides snapshot-style data at a point in time, and its extraction capabilities quickly bog down for medium and large enterprises. The result is that it is very difficult to extract a full transactional history to support a people analytics program without arcane workarounds that give you inaccurate data feeds. We ultimately had to build a process that interrogates the Workday API about dozens of different behaviors, inspects the results, and runs different extractions based on what it finds. Additionally, most tools don't cater for Workday's weekly maintenance windows, during which integrations will go down. We've built integrations that overcome these native and nuanced challenges for SuccessFactors, Oracle, and many other systems our customers work with. An example of a Workday extraction task is below.
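To make the shape of the problem concrete, here is a heavily simplified sketch of the two behaviors described above: paging through a large extraction, and refusing to run during a maintenance window. The window timing is an assumption for illustration (real Workday windows vary by tenant and region), and `fetch_page` is a stand-in for the actual API calls, not One Model's implementation:

```python
from datetime import datetime

# Assumed maintenance window, illustrative only: Saturdays 02:00-08:00 UTC.
# Real Workday maintenance schedules differ by tenant and region.
def in_maintenance_window(ts: datetime) -> bool:
    return ts.weekday() == 5 and 2 <= ts.hour < 8

def extract_all(fetch_page, now: datetime):
    """Pull a full history page by page, skipping runs that would collide
    with the source system's maintenance window."""
    if in_maintenance_window(now):
        # Better to reschedule than to collect a partial, inaccurate feed.
        return None
    records, page = [], 0
    while True:
        batch = fetch_page(page)  # stand-in for a real Workday API call
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

# Fake pages simulating a paged API response
pages = [[{"id": 1}, {"id": 2}], [{"id": 3}], []]
result = extract_all(lambda p: pages[p], datetime(2024, 1, 8, 12))  # a Monday
print(len(result))  # 3
```

A production extractor layers much more on top — probing the API's behavior first, branching the extraction strategy on what it finds, and retrying after outages — which is exactly the complexity the paragraph above describes.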
Our superpower. We've built for the massive complexity of understanding and orchestrating HR data, enabling infinite extension while preserving maintainability. What's more, it's transparent: customers can see how their data is processed, view its lineage, and interact with the logic and data models. This is perfect for IT to understand what is being done with your data and, ultimately, to have confidence in the resulting Analytics-Ready Data Models.
Your clean, connected data is in demand from other stakeholders. You need to be able to get it out and feed those stakeholders, demonstrating your mastery of the people data domain in the process. One Model facilitates this through our Data Destinations capability, which allows the creation and automated scheduling of data feeds to your people data consumers.
Feeds can be created in the One Model UI in the same way you would build a list report, or from an existing table, and then simply added as a data destination.
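Conceptually, a scheduled feed boils down to a source query, a target, and a cadence. The sketch below is a hypothetical illustration of those pieces — the actual configuration happens in the One Model UI, and every name and path here is made up:

```python
from dataclasses import dataclass

@dataclass
class DataDestination:
    name: str
    source_query: str   # the list report or table feeding the destination
    target: str         # e.g. an S3 path, SFTP location, or warehouse table
    schedule_cron: str  # automated delivery cadence

# Illustrative feed definition (names and paths are hypothetical)
feed = DataDestination(
    name="headcount_to_finance",
    source_query="SELECT employee_id, org_unit, fte FROM headcount_monthly",
    target="s3://example-bucket/finance/headcount/",
    schedule_cron="0 6 1 * *",  # 06:00 on the 1st of each month
)
print(feed.name)
```

Framing feeds this way makes the point of the paragraph above: once you can define a query, you can turn it into a recurring deliverable for any downstream consumer.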
We've always given customers the option to connect directly to our data warehouse and use their own tools, such as Tableau, Power BI, R, SAC, and Informatica. Our philosophy is one of openness: we want to meet customers where they are, so you can use the tools you need to get the job done. In addition, a number of customers host their own AWS Redshift data warehouse that we connect to. Data destinations can also feed other warehouses, or you can use external tooling to sync data to warehouses like Azure SQL, Google, and Snowflake. A few examples:
Snowflake - https://community.snowflake.com/s/article/How-To-Migrate-Data-from-Amazon-Redshift-into-Snowflake
Azure - https://docs.microsoft.com/en-us/azure/data-factory/connector-amazon-redshift
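The common pattern behind syncs like the Snowflake article above is Redshift → S3 → target warehouse: `UNLOAD` the data to S3, then `COPY` it into the destination. As a hedged sketch, here is how those two statements might be generated — the bucket, IAM role ARN, and stage names are all illustrative placeholders:

```python
# Sketch of the Redshift -> S3 -> Snowflake sync pattern.
# All identifiers (schema, bucket, role ARN, stage) are illustrative.

def redshift_unload(table: str, s3_prefix: str, iam_role: str) -> str:
    """Build a Redshift UNLOAD statement that exports a table to S3 as Parquet."""
    return (
        f"UNLOAD ('SELECT * FROM {table}') "
        f"TO '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS PARQUET;"
    )

def snowflake_copy(table: str, stage: str) -> str:
    """Build a Snowflake COPY statement loading from an external stage
    assumed to point at the same S3 prefix."""
    return (
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (TYPE = PARQUET) "
        f"MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;"
    )

unload_sql = redshift_unload(
    "people_analytics.headcount",
    "s3://example-bucket/exports/headcount/",
    "arn:aws:iam::123456789012:role/example-unload-role",
)
copy_sql = snowflake_copy("headcount", "headcount_stage")
print(unload_sql)
print(copy_sql)
```

The Azure Data Factory route linked above follows the same idea with a managed connector in place of the hand-rolled statements.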
With One Model, all metric definitions are available for reference, along with interactive explanations and drill-through to the transactional detail. Data governance can be centralized, with permission controls over who can edit or create their own strategic metrics, which may differ from the organizational standard.
We provide standard content tailored to the customer's own data, giving you out-of-the-box leverage as you stand up your people analytics programs. Customers typically take these and create their own storyboards strategic to their needs. It's straightforward to create and distribute your own executive, HRBP, recruiting, or analysis-project storyboards to a wide range of users — all controlled by the most advanced role-based security framework, which ensures users see only the data they are permissioned to see, while virtually eliminating user maintenance through automated provisioning, role assignment, and contextual security logic that links each user to their own data point.