Target audience: Tech Buyer | Publication date: May 2024 | Document type: IDC Perspective | Document number: US52060524

AI Attestation Services: How Do You Know Your AI Models Perform as Expected?

By: Philip D. Harris, CISSP, CCSK


Abstract


This IDC Perspective discusses artificial intelligence (AI) attestation services and how businesses can determine whether their AI models are performing as expected. Corporations have been investing significantly in artificial intelligence since generative AI (GenAI) became real, usable, and value adding. Organizations see major potential benefits in the areas of customer satisfaction, new and/or enhanced products, customer growth, and revenue growth. With these benefits come the risks AI can present if AI models are inadequately developed, evaluated, assessed, implemented, maintained, and used for business decisions.

There is a potential need for independent, third-party AI attestation or peer-review service providers that can deliver an added level of trust and certainty that the risks of implementing certain AI models are manageable.

"AI is here to stay, and it is only going to become more sophisticated for business use. We now have an opportunity in these nascent stages to drive best practices in the development, management, maintenance, and enhancement of AI for critical business needs," says Philip Harris, research director, Governance, Risk, and Compliance Services and Software at IDC. "By taking these initial steps carefully and thoughtfully, the trustworthiness of AI will grow, and organizations will have significantly less risk to contend with going forward."
