Many organizations are struggling to keep up with the constantly changing legislative, regulatory, and security requirements surrounding artificial intelligence (AI) as the technology continues to rapidly evolve, according to a new industry resource from global HR research and advisory firm McLean & Company. The research suggests that as organizations are rushing to implement AI solutions to gain a competitive advantage, establishing AI strategy and governance is often an afterthought. Despite the crucial role HR plays in establishing internal systems to foster organizational culture and catalyze digital change, it is often excluded from strategic planning. To help organizational and HR leaders develop responsible AI guiding principles, enable successful AI adoption, and mitigate risks, McLean & Company has released its new guide, Develop Responsible AI Guiding Principles.
“Many organizations are not implementing responsible AI guiding principles, which is a critical step in AI governance and the broader AI strategy,” says Lisa Highfield, principal director, Human Resources Technology and Artificial Intelligence, at McLean & Company. “HR leaders have not only an opportunity but a responsibility to help prepare the organization for change and help secure the success of AI adoption as we progress into the future of work.”
In the new HR resource, the firm explains that AI will transform systems, processes, people, and organizations with increasing velocity, breadth, and depth. Introducing AI technologies can create tension within organizations, as the people who make up an organization often hold differing views and competing priorities.
However, organizational and HR leaders coming together to build aligned guiding principles can significantly aid in the success and management of AI technology. To support a collaborative approach to developing responsible AI guiding principles, McLean & Company has created a three-step process for organizational and HR leaders. The steps are outlined below:
- Step 1: Gather key information. Establish an AI governance committee to discuss organizational AI strategy and assess risks associated with AI adoption and use, including legal, regulatory, ethical, and compliance considerations. Next, gather related inputs to help inform the AI guiding principles.
- Step 2: Identify and draft responsible AI guiding principles. Research and draft AI guiding principles that fit the unique needs of the organization and clearly define the rationale and implications of each principle.
- Step 3: Communicate and iterate. Put the next steps into action, such as developing AI policies and communicating them across employee groups. Finally, actively evaluate, maintain, update, and reinforce the responsible AI guiding principles within the organization.
McLean & Company advises organizational and HR leaders that there is no single source of truth for any specific organization’s responsible AI guiding principles. Rather, it is important to consider industry, regulatory, legal, compliance, and organizational factors, which may change over time. The firm therefore advocates that a cross-functional team, including HR, be formed to review AI governance and AI guiding principles. HR is well positioned to strategically support the organization’s development of responsible AI guiding principles and policies given its understanding of the ethical, legal, and people-focused risks that can accompany AI adoption.