DRAFT INTERNATIONAL STANDARD ISO/IEC DIS 42005
Information technology – Artificial intelligence – AI system impact assessment
ICS: 35.020
THIS DOCUMENT IS A DRAFT CIRCULATED FOR COMMENT AND APPROVAL. IT IS THEREFORE SUBJECT TO CHANGE AND MAY NOT BE REFERRED TO AS AN INTERNATIONAL STANDARD UNTIL PUBLISHED AS SUCH.
IN ADDITION TO THEIR EVALUATION AS BEING ACCEPTABLE FOR INDUSTRIAL, TECHNOLOGICAL, COMMERCIAL AND USER PURPOSES, DRAFT INTERNATIONAL STANDARDS MAY ON OCCASION HAVE TO BE CONSIDERED IN THE LIGHT OF THEIR POTENTIAL TO BECOME STANDARDS TO WHICH REFERENCE MAY BE MADE IN NATIONAL REGULATIONS.
RECIPIENTS OF THIS DRAFT ARE INVITED TO SUBMIT, WITH THEIR COMMENTS, NOTIFICATION OF ANY RELEVANT PATENT RIGHTS OF WHICH THEY ARE AWARE AND TO PROVIDE SUPPORTING DOCUMENTATION.
COPYRIGHT PROTECTED DOCUMENT
© ISO/IEC 2024
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO's member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@
Published in Switzerland
Contents
Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4.1 General
4.2 Documenting the process
4.3 Integration with other organizational management processes
4.4 Timing of AI system impact assessment
4.5 Guidance for determining the scope of the AI system impact assessment
4.6 Allocating responsibilities
4.7 Establishing thresholds for sensitive uses, prohibited uses and impact scales
4.8 Performing the AI system impact assessment
4.9 Analysing the results of the AI system impact assessment
4.10 Recording and reporting
4.11 Approval process
4.12 Monitoring and review
5 Documenting the AI system impact assessment
5.1 General
5.2 Scope of the AI system impact assessment
5.3 AI system information
5.3.1 AI system description
5.3.2 AI system features
5.3.3 AI system purpose
5.3.4 Intended uses
5.3.5 Unintended uses
5.4 Data information and quality
5.4.1 General
5.4.2 Data information
5.4.3 Data quality documentation
5.5 Algorithm and model information
5.5.1 General
5.5.2 Information on algorithms used by the organization
5.5.3 Information on models used in an AI system
5.5.4 Information on algorithm development
5.5.5 Information on model development
5.6 Deployment environment
5.6.1 Geographical area and languages
5.6.2 Deployment environment complexity and constraints
5.7 Relevant interested parties
5.8 Actual and potential impacts
5.8.1 General
5.8.2 Benefits and harms
5.8.3 AI system failures and misuse or abuse
5.9 Measures to address harms and benefits
Annex A (informative) Guidance for use with ISO/IEC 42001
Annex B (informative) Guidance for use with ISO/IEC 23894
B.1 General
B.2 Differences between risk management and AI system impact assessment
B.3 Risk management principles related to AI system impact assessment
Annex C (informative) Harms and benefits taxonomy
Annex D (informative)
D.1 Introduction
D.2 Coordination guide
D.3 Impact assessment alignment guide
D.4 Mapping guide
Annex E (informative) Examples of AI system impact assessment templates
E.1 General
E.2 Example AI system impact assessment template
AI system impact assessment for [system name or identification]
Section A - System information
Section B - Data information and quality
Section C - Algorithms and models information
Section D - Deployment environment
Section E - Relevant interested parties
Section F - Actual and potential benefits and harms
Section G - AI system failures and misuse or abuse
Bibliography
Foreword
ISO (the International Organization for Standardization) is a worldwide federation of national standards bodies (ISO member bodies). The work of preparing International Standards is normally carried out through ISO technical committees. Each member body interested in a subject for which a technical committee has been established has the right to be represented on that committee. International organizations, governmental and non-governmental, in liaison with ISO, also take part in the work. ISO collaborates closely with the International Electrotechnical Commission (IEC) on all matters of electrotechnical standardization.

The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types of ISO documents should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2.

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. ISO shall not be held responsible for identifying any or all such patent rights. Details of any patent rights identified during the development of the document will be in the Introduction and/or on the ISO list of patent declarations received.

Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.

For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see

This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology, Subcommittee SC 42, Artificial intelligence.

Any feedback or questions on this document should be directed to the user's national standards body. A complete listing of these bodies can be found at