ETSI’s consumer IoT cybersecurity ‘conformance assessments’: parallels with the AI Act
In early September 2021, the European Telecommunications Standards Institute (ETSI) published its European Standard to lay down baseline cybersecurity requirements for Internet of Things (IoT) consumer products (ETSI EN 303 645 V2.1.1). The Standard is a recommendation to manufacturers to develop IoT devices securely from the outset. It also provides an internationally recognized benchmark – informed by several external sources, such as the interactive tool of the European Union Agency for Cybersecurity (ENISA) – to assess whether such devices have a minimum level of cybersecurity.
ETSI’s Standard takes an outcomes-focused approach, giving manufacturers the flexibility to implement the most appropriate security solutions for their products rather than prescribing specific security measures. It aims to protect IoT devices against elementary attacks on fundamental design weaknesses, such as the use of easily guessable passwords.
A technical ETSI specification complements the Standard, providing manufacturers, developers, suppliers, and implementers with a methodology to conduct conformance assessments of their IoT products against the baseline requirements (ETSI TS 103 701 V1.1.1). According to the German Federal Office for Information Security (BSI), the specification “ensures that test results are comparable to the security characteristics of IoT devices. In this way, IoT-experienced persons are enabled to make a corresponding security assessment.” Testing according to the specification may serve as a pathway for manufacturers and providers to obtain security labels on their products, such as Germany’s IT Security Label, a process which the BSI opened for applications in December 2021 for two categories of products: broadband routers and email services.
Moreover, the framework published by ETSI may come to serve as the basis for a technical specification that the European Commission has requested ETSI to develop for “internet-connected radio equipment” under the Radio Equipment Directive (RED) and its recent Delegated Act, notably concerning the incorporation of network security, privacy, data protection, and fraud prevention features.
In this piece, we first summarize the Standard’s narrow approach to data protection requirements (Section 1). We then describe ETSI’s conformance assessment methodology (Section 2). Lastly, we explore whether we can draw a parallel or find synergies with the “conformity assessment” in the proposed AI Act (Section 3). In answering this question, this blogpost aims to highlight how ETSI’s overall framework for IoT security conformance assessments compares with the AI systems conformity assessment requirements laid down in the European Commission’s proposal for an AI regulation. Such analysis is particularly relevant in light of the standardization request that the Commission has made to ETSI to operationalize certain requirements of the AI Act, and of ETSI’s calls for leaving important parts of the regulation – like the definition of Artificial Intelligence itself and the list of high-risk use cases – to technical standards.
This piece follows a previously published deep dive analysis of the “conformity assessment” in the AI Act and how it compares to the Data Protection Impact Assessment provided by the GDPR.
1. A narrow approach to data protection requirements
The Standard starts by non-exhaustively listing the categories of IoT products to which it applies. It is focused on consumer IoT devices that are connected to network infrastructure (such as the Internet or a home network) and their interactions with associated services, such as connected children’s toys and baby monitors, smart cameras, TVs and speakers, wearable health trackers, connected home automation systems, connected appliances (e.g., washing machines and fridges), and smart home assistants. Products that are primarily intended to be used in manufacturing, healthcare, or other industrial applications are excluded.
We remark that the references to personal data protection in ETSI’s framework are somewhat limited, even though it covers consumer-facing IoT products. The Standard takes a more comprehensive approach to device and information security. Regarding the security, lawfulness, and transparency of personal data processing through consumer IoT devices, the Standard recommends that manufacturers focus on deploying appropriate best-practice cryptography for transfers of personal data between the device and associated services (e.g., the cloud). On this point, particular emphasis is given to cases where the processed data is sensitive (e.g., video streams of security cameras, payment information, location data, and the content of communications). Note that the examples of “sensitive” data given in the Standard do not align with the types of data that are considered “special categories” under Article 9 GDPR. For instance, images captured by security cameras, payment data, and location data do not inherently fall under that provision, unless they reveal or may reveal particularly sensitive information (e.g., a person’s health conditions, sexual orientation, or political leanings).
Manufacturers are expected to inform users clearly about several data protection aspects, including: the external sensing capabilities of the device (e.g., optic or acoustic sensors); what personal data is processed (including telemetry data, which should be restricted to the minimum necessary for service provision); and – for each device and service – how the device or service is being used, by whom (e.g., by third parties, such as advertisers), and for what purposes.
Where consent is necessary for specific processing purposes, there should be a mechanism to seek valid consent from users and to allow them to withdraw such consent (e.g., through the device’s user configurations). Users should also be given an easy way (with clear instructions) to delete their data, including their details, personalized configurations, and access credentials. If users use this functionality, they should receive confirmation that their data has been deleted from services, devices, and apps. The confirmation is particularly important in cases of transfer of ownership, temporary usage, or disposal of the device.
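The Standard does not prescribe how such mechanisms are implemented. As a purely illustrative sketch (the class and method names are ours, not drawn from ETSI EN 303 645), a device’s companion service could expose the consent-withdrawal and deletion-with-confirmation functionality along these lines:

```python
from dataclasses import dataclass, field

@dataclass
class UserDataManager:
    """Hypothetical sketch of the user-facing consent and deletion
    mechanisms the Standard expects, not an implementation of it."""
    consents: dict = field(default_factory=dict)       # purpose -> bool
    personal_data: dict = field(default_factory=dict)  # e.g., configs, credentials

    def give_consent(self, purpose: str) -> None:
        self.consents[purpose] = True

    def withdraw_consent(self, purpose: str) -> None:
        # Withdrawal should be as easy as giving consent,
        # e.g., reachable through the device's user configurations.
        self.consents[purpose] = False

    def delete_all_data(self) -> str:
        # Erase details, personalized configurations, and access
        # credentials, then return an explicit confirmation message,
        # which matters on transfer of ownership or disposal.
        self.personal_data.clear()
        self.consents.clear()
        return "Your data has been deleted from services, devices, and apps."
```

The explicit return value of `delete_all_data` mirrors the Standard’s expectation that users receive confirmation that deletion actually occurred.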
Beyond the limited data protection-focused provisions, the Standard mainly recommends technical and organizational measures for manufacturers to attain an appropriate level of cybersecurity in the consumer IoT devices they design. These measures can include:
- Examining telemetry data (e.g., usage and measurement data) and validating input data to detect security anomalies and prevent potential gaps and weaknesses;
- The use of unique per-device password authentication mechanisms or even multi-factor authentication (MFA);
- Creating suitable policies and procedures for vulnerability reporting, monitoring, and timely response;
- Enabling secure software updates, which should be simple for the user to apply and automatically triggered whenever needed to prevent vulnerabilities;
- Implementing secure storage of sensitive security parameters and secure communications (e.g., through best practice cryptography and authentication); and/or
- Minimizing exposures to attacks, notably by reducing the unauthenticated disclosure of security-relevant information (e.g., software version) and only enabling software services that are used or required for the devices’ intended use.
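The Standard leaves the implementation of these measures to manufacturers. To make the first password-related measure concrete, here is a minimal sketch (our own illustration, not part of ETSI EN 303 645) of provisioning a unique, random per-device factory password instead of a universal default:

```python
import hashlib
import secrets

def provision_device_password(device_id: str):
    """Generate a unique, random factory password for a single device.

    Returns (plaintext, salt, digest): the plaintext is disclosed once
    (e.g., printed on the device label), while the backend retains only
    the salt and a slow PBKDF2 hash. The device_id parameter is
    hypothetical bookkeeping, not an input to the password itself.
    """
    password = secrets.token_urlsafe(12)  # ~96 bits of entropy per device
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return password, salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Constant-time check of a submitted password against the stored hash."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return secrets.compare_digest(candidate, digest)
```

Because each device receives an independent random credential, an attacker who learns one device’s password gains nothing against the rest of the fleet, which addresses the “easily guessable passwords” weakness the Standard targets.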
2. ETSI’s “conformance assessment” methodology
ETSI’s August 2021 specification provides developers, manufacturers, vendors, and distributors (i.e., Supplier Organisations, or ‘SOs’) with test scenarios that they can leverage to test their new IoT products (‘DUTs’, short for Devices Under Test) against the baseline requirements set out in the Standard.
SOs are required to request Test Laboratories (TLs) – entities such as independent testing authorities, user organizations, or an identifiable part of an SO that carries out conformance assessments – to test the relevant product against the Standard. This means that the conformance assessment can be led by a third party (testing authority), a second party (user organization), or the SO itself (self-assessment). Moreover, SOs should provide the TL with all necessary information, including the Implementation Conformance Statement (ICS) and the Implementation eXtra Information for Testing (IXIT). Test Laboratories must operate competently to generate valid results; however, the requirements for the competence of TLs, and for the independence of testing authorities acting as TLs, are not developed under the ETSI framework.
Figure 1 contains a visual summary of the assessment procedure according to the specification. For brevity, we refer to the specification for the terminology.
Existing security certifications or third-party evaluations of the IoT device and/or its parts may be partially used as conformity evidence to complement or inform the assessment under the specification. In this regard, the SO shall provide the TL with all information necessary to verify that evidence (e.g., the certification, certification details, and test reports).
3. Can we draw a parallel with Conformity Assessments under the AI Act?
In Section 3 we explore whether a conformity assessment performed under the proposed AI Act can be used to inform the conformance assessment of a consumer IoT device, or the other way around. For that, we first need to clarify the scope of the two assessments.
- The AI Act conformity assessment refers to high-risk AI systems, which we analyze in more detail in another dedicated blog post. ETSI’s conformance assessment refers to consumer IoT devices, which are non-exhaustively listed under the Standard (ETSI EN 303 645 V2.1.1).
- Under the AI Act proposal, a high-risk AI system is assessed in terms of its conformity to the requirements listed under Chapter 2 of Title III. These relate to the quality of used datasets, technical documentation, record-keeping, transparency, human oversight, robustness, accuracy and cybersecurity. Under ETSI’s conformance assessment, a consumer IoT device is assessed against the Standard (ETSI EN 303 645 V2.1.1), which is mainly cybersecurity-oriented.
- While the conformity assessment under the AI Act for high-risk AI systems will become legally mandatory for the systems’ providers once the final text is passed, ETSI’s conformance assessment is optional for SOs of consumer IoT products, as it merely serves to attest to the security and reliability of such products to consumers and other players on the market, and to obtain security labels for those products.
With regard to the latter distinction, it is worth noting that the requirement to carry out a conformity assessment applies to manufacturers of certain connected products. Manufacturers of “internet-connected radio equipment” under the RED’s Delegated Act are required thereunder to carry out a conformity assessment (Article 17), draw up an EU declaration of conformity, and affix a CE marking (Article 10). Manufacturers can follow harmonized standards for conformity assessments to the extent these have been published in the Official Journal of the EU (see also the European Commission’s standardization request). ETSI’s methodology remains optional for manufacturers of internet-connected radio equipment, even if it can be seen as a good guideline for the upcoming standards and gives them a head start towards compliance with the RED and its Delegated Act. For completeness, we remark that the Delegated Act entered into force in January 2022 and will be enforceable by mid-2024.
A consumer IoT device under the ETSI framework and a high-risk AI system under the AI Act could co-exist if the following conditions are cumulatively met:
- An IoT device which belongs to one of the devices non-exhaustively listed under the Standard is also covered by the Union harmonization legislation listed in Annex II of the AI Act Proposal (e.g., connected children’s toys);
- The IoT device embeds an AI system which functions as a safety component of the device; and
- The device as a whole requires a third-party conformity assessment under the New Legislative Framework legislation listed in Annex II of the proposed AI Act.
We further note that, in theory, high-risk AI systems “intended to be used for the ‘real-time’ and ‘post’ remote biometric identification of natural persons” – covered by the AI Act’s list of high-risk AI systems under Annex III (point 1) – can be integrated into a consumer IoT device (e.g., authentication through iris scan in a smart TV, or through voice recognition in smart home assistants).
In these cases, the provider of the AI system shall either: (i) carry out a first-party conformity assessment, provided it has applied harmonized standards referred to in Article 40 or, where applicable, common specifications referred to in Article 41 AI Act; or (ii) involve a third party (notified body) in its conformity assessment. This means that where “the provider has not applied or has applied only in part harmonized standards referred to in Article 40, or where such harmonized standards do not exist and common specifications referred to in Article 41 are not available, the provider shall follow” the third-party conformity assessment route. Thus, until such approved harmonized standards or common specifications become available, third-party conformity assessments will remain the rule for ‘real-time’ and ‘post’ remote biometric identification AI systems that are not strictly prohibited under Article 5 of the AI Act Proposal.
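This routing rule can be paraphrased as a small decision function. The sketch below is our illustrative reading of the quoted Article 43 language, not normative text, and the function and parameter names are ours:

```python
def conformity_assessment_route(harmonized_standards_applied_in_full: bool,
                                common_specifications_applied: bool) -> str:
    """Illustrative paraphrase of the Article 43 routing rule for
    remote biometric identification systems (Annex III, point 1)."""
    # First-party assessment (internal control) is available only when
    # harmonized standards (Art. 40) are applied in full, or applicable
    # common specifications (Art. 41) are applied.
    if harmonized_standards_applied_in_full or common_specifications_applied:
        return "first-party"
    # Otherwise - standards not applied, applied only in part, or
    # unavailable - a notified body must be involved.
    return "third-party"
```

Until harmonized standards or common specifications exist and are applied in full, every path through this function leads to the third-party route, which is why third-party assessments will remain the rule for these systems in the near term.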
For more details on when a high-risk AI system is subject to a first- or third-party conformity assessment, you can read our earlier blog post here.
If the above conditions are met, the IoT device embeds a high-risk AI system and thus both assessments (under the AI Act and the Standard) are triggered, even if – as previously mentioned – ETSI’s conformance assessment is optional for SOs. Then the question of how the two assessments relate becomes relevant. According to the AI Act, compliance of the high-risk AI system with specific requirements should be assessed as part of the conformity assessment already foreseen for the IoT device (see Recital 63 and Article 43(3) AI Act). Furthermore, according to Article 24 and Recital 55 of the AI Act, it is the manufacturer of the IoT device that needs to ensure that the AI system embedded in the IoT device complies with the requirements of the AI Act.
It is possible that a device manufacturer may use a high-risk AI system already placed on the market by another supplier. In case the AI system has gone through a conformity assessment, then the IoT device manufacturer could use the existing assessment as a building block to perform the conformance assessment of the IoT device under ETSI’s methodology. This becomes particularly relevant given the fact that the proposed AI Act contains requirements for high-risk AI systems that resemble some of the ones contained in the ETSI European Standard. Most notably, these include requirements relating (1) to the automatic recording of events (‘logs’), (2) the transparency towards users, and (3) ensuring an appropriate level of accuracy, robustness, and cybersecurity. The latter requirement includes ensuring that the AI system is:
- resilient as regards errors, faults, or inconsistencies that may occur within the system or the environment in which the system operates;
- robust, which may be achieved through technical redundancy solutions, including backup or fail-safe plans; and
- tamper-proof, in the sense of protected against attempts by unauthorized third parties to alter their use or performance by exploiting the system vulnerabilities.
From a different perspective, what is the practical effect on compliance with the AI Act’s requirements if a provider of a high-risk AI system embedded in a consumer IoT product passes the ETSI conformance assessment before fulfilling its conformity assessment obligations under the AI Act Proposal? By using the ETSI Standard, the manufacturer can benefit from the partial presumption of compliance with the Article 15 cybersecurity requirements set by Article 42(2) AI Act, as the Standard’s statement of conformity covers part of those requirements, as explained above.
Moving forward, it will be interesting to see whether more standards bodies will work on technical specifications that may be leveraged when carrying out conformity assessments under the proposed AI Act, as ETSI did for manufacturers of consumer IoT products. It will also be relevant to see whether the ETSI framework’s requirements are transposed into a technical standard for internet-connected radio equipment’s conformity assessments under the RED, and to keep up with developments in the European Commission’s intention to propose a Cyber Resilience Act for IoT products that fall outside of the RED’s Delegated Act. The latter initiative intends to protect consumers “from insecure products by introducing common cybersecurity rules for manufacturers and vendors of tangible and intangible digital products and ancillary services”.
Further reading:
- Introduction to the Conformity Assessment under the draft EU AI Act, and how it compares to DPIAs
- Standardisation body calls for AI definition, categorisation to be decided as standards – EURACTIV.com