IEEE Standard Offers 6 Steps for AI Procurement

For more than three years, a working group from the IEEE Standards Association has been refining a standard for the procurement of artificial intelligence and automated decision-making systems: IEEE 3119-2025. It is intended to help procurement teams identify and manage risks in high-risk domains. Such systems are used by government entities involved in education, health, employment, and many other areas of the public sector. Last year, the working group partnered with a European Union agency to assess the draft standard's components and to gather information about users' needs and their views on the standard's value.
At the time, the standard included five processes to help users develop their solicitations and to identify, mitigate, and monitor the harms commonly associated with high-risk AI systems.
Those processes were problem definition, vendor evaluation, solution evaluation, contract negotiation, and contract monitoring.
Feedback from the EU agency led the working group to reconsider the processes and the sequencing of several activities. The final draft now includes an additional process: solicitation preparation, which comes just after the problem-definition process. The working group found that the added process addresses challenges organizations have encountered in preparing AI-specific solicitations, such as the need to add transparency and robust data requirements and to incorporate questions about the maturity of a vendor's AI governance.
The EU agency also stressed that including solicitation preparation is essential because it gives procurement teams additional opportunities to align their solicitations with technical requirements and with questions about responsible AI system choices. Leaving room for adjustments is especially relevant when AI acquisitions occur in emerging and changing regulatory environments.
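The resulting six-process sequence can be sketched as a simple ordered workflow. This is an illustrative model only; the enum names below are paraphrases of the standard's process names, not identifiers taken from IEEE 3119 itself:

```python
from enum import Enum
from typing import Optional

class ProcurementProcess(Enum):
    """IEEE 3119's six procurement processes, in order (names paraphrased)."""
    PROBLEM_DEFINITION = 1
    SOLICITATION_PREPARATION = 2  # the process added after EU agency feedback
    VENDOR_EVALUATION = 3
    SOLUTION_EVALUATION = 4
    CONTRACT_NEGOTIATION = 5
    CONTRACT_MONITORING = 6

def next_process(current: ProcurementProcess) -> Optional[ProcurementProcess]:
    """Return the process that follows `current`, or None after the last one."""
    ordered = list(ProcurementProcess)  # Enum members iterate in definition order
    idx = ordered.index(current)
    return ordered[idx + 1] if idx + 1 < len(ordered) else None
```

For example, `next_process(ProcurementProcess.PROBLEM_DEFINITION)` yields `SOLICITATION_PREPARATION`, reflecting where the new process sits in the sequence.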
Gisele Waters
IEEE 3119's place in the standards ecosystem
There are currently several internationally accepted standards for AI management, AI ethics, and general software acquisition. Those from IEEE and the International Organization for Standardization target AI design, use, and life-cycle management.
Until now, there has been no internationally accepted consensus standard that focuses on the procurement of AI tools and offers responsible purchasing guidance for high-risk AI systems serving the public interest.
The IEEE 3119 standard fills that gap. Unlike the ISO/IEC 42001 AI management standard and other certifications related to generic AI and risk governance, the new IEEE standard offers a risk-based, operational approach to help government agencies adapt traditional procurement practices.
Governments have an important role to play in the responsible deployment of AI. However, market dynamics and unequal AI expertise between industry and government can be obstacles that discourage success.
One of the standard's main objectives is to better inform procurement leaders about what they are buying before they make high-risk AI purchases. IEEE 3119 defines high-risk AI systems as those that make, or are a substantial factor in making, consequential decisions that could have significant impacts on people, groups, or society. The definition is similar to the one used in Colorado's 2024 AI Act, the first U.S. state-level law addressing high-risk AI systems.
The standard's processes, however, complement ISO/IEC 42001 in many ways. The relationship between the two is illustrated below.
International standards, often characterized as soft law, are used to shape AI development and to encourage international cooperation on its governance.
Hard laws for AI, or legally binding rules and obligations, are a work in progress around the world. In the United States, a patchwork of state legislation governs different aspects of AI, and the approach to national AI regulation is fragmented, with different federal agencies implementing their own guidelines.
Europe has led the way with the European Union's AI Act, which began governing AI systems according to their risk levels when it entered into force last year.
But the world still lacks hard regulatory laws with international scope.
The IEEE 3119-2025 standard is aligned with existing hard laws. Because of its emphasis on procurement, the standard supports the high-risk provisions outlined in Chapter III of the EU AI Act and in Colorado's AI Act. The standard also conforms to the proposed Texas HB 1709 legislation, which would mandate reporting on the use of AI systems by certain business entities and state agencies.
Because most AI systems used in the public sector are procured rather than built in-house, IEEE 3119 applies to commercial AI products and services that do not require substantial modifications or customizations.
The standard's target audience
The standard is intended for:
- Mid-level procurement professionals and interdisciplinary team members with a moderate level of AI governance and AI system knowledge.
- Public- and private-sector procurement professionals who serve as coordinators or buyers, or who have equivalent roles within their entities.
- Managers and supervisors outside of procurement who are either responsible for procurement or oversee staff who perform procurement functions.
- Professionals employed by governing entities involved in public education, utilities, transportation, and other publicly funded services who work on or manage procurement and want to adapt their purchasing processes for AI tools.
- AI vendors seeking to understand new transparency and disclosure requirements for their high-risk commercial products and solutions.
Workforce training program
The IEEE Standards Association has partnered with the AI Procurement Lab to offer the IEEE Responsible AI Procurement Training program. The course covers how to apply the standard's core processes and how to adapt current practices for the procurement of high-risk AI.
The standard includes more than 26 tools and rubrics covering the six processes, and the training program explains how to use many of them. For example, the training includes instructions on how to conduct a risk-appetite analysis, how to apply the vendor-evaluation guide to analyze AI vendors' claims, and how to create an AI procurement "risk register" that links identified use-case risks to their potential mitigations. The training session is now available for purchase.
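As a rough illustration of what such a risk register might track, here is a minimal sketch in Python. The field names and example risks are hypothetical assumptions for illustration, not the schema defined by the standard or the training program:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RiskEntry:
    """One row of a hypothetical AI procurement risk register.
    Field names are illustrative, not IEEE 3119's exact schema."""
    risk_id: str
    description: str
    likelihood: str                 # e.g. "low" / "medium" / "high"
    impact: str                     # e.g. "low" / "medium" / "high"
    mitigations: List[str] = field(default_factory=list)

@dataclass
class RiskRegister:
    entries: List[RiskEntry] = field(default_factory=list)

    def add(self, entry: RiskEntry) -> None:
        self.entries.append(entry)

    def unmitigated(self) -> List[RiskEntry]:
        """Entries that still lack any recorded mitigation."""
        return [e for e in self.entries if not e.mitigations]
```

A procurement team might record an identified usage risk, attach candidate mitigations as they are negotiated, and query `unmitigated()` to see which risks still need attention before contract signing.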
It is still early days for AI integration. Decision-makers do not yet have much experience in procuring AI for high-risk domains or in mitigating those risks. The IEEE 3119-2025 standard aims to help agencies build and strengthen their AI risk mitigation muscles.