Article

Application of a Smart Grid Interoperability Testing Methodology in a Real-Time Hardware-In-The-Loop Testing Environment

Institute for Automation of Complex Power Systems, E.ON Energy Research Center, RWTH Aachen University, 52074 Aachen, Germany
*
Authors to whom correspondence should be addressed.
Energies 2020, 13(7), 1648; https://doi.org/10.3390/en13071648
Submission received: 28 February 2020 / Revised: 20 March 2020 / Accepted: 24 March 2020 / Published: 2 April 2020

Abstract
Interoperability testing is widely recognized as a key to achieving seamless interoperability of smart grid applications, given the complex nature of modern power systems. In this work, the interoperability testing methodology proposed by the European Commission Joint Research Centre is applied to a specific use case in the context of smart grids. The selected use case examines a flexibility activation mechanism in a power grid system and involves a DSO SCADA, a Remote Terminal Unit and a flexibility source, interacting to support a voltage regulation service. The adopted test bed consists of a real-time power grid simulator, a communication network emulator and models of the use case actors in a hardware-in-the-loop setup. The breakdown of the interoperability testing problem is accomplished by mapping the use case to the SGAM layers, specifying the Basic Application Profiles together with the Basic Application Interoperability Profiles (BAIOPs) and defining the design of experiments to carry out during the laboratory testing. Furthermore, the concepts of inter- and intra-BAIOP testing are formalized to reflect complementary interests of smart grid stakeholders. Experimental results prove the applicability of the methodology for testing the interoperability of large-scale and complex smart grid systems and reveal interesting features and possible pitfalls which should be considered when investigating the parameters responsible for the disruption of system interoperability.

Graphical Abstract

1. Introduction

To meet climate change and energy policy objectives, a major transformation of the electricity infrastructure is required. Therefore, in order to attain the technological, social and regulatory objectives, Smart Grids (SGs) were introduced as systems of systems with a broad scope incorporating the electric power, information, communication, and business domains [1]. Moving towards this new era of intelligent SGs, it is increasingly important to understand how the different SG components interoperate, and how the proposed standards ensure interoperability (IOP) among these components. Furthermore, new components and renewable technologies are changing the conventional power system structure. The current energy infrastructure will have to become more flexible, requiring the establishment of data communications among all actors (industrial and end users). In such a case, it is necessary to guarantee that all components work together seamlessly, i.e., that they are ‘interoperable’. According to [2], IOP can be defined as the “ability of two or more networks, systems, devices, applications, or components to interwork, to exchange and use information in order to perform required functions”. In this regard, IOP was identified from the very beginning as the main challenge for the deployment of SGs, in which technologies and companies from very diverse domains converge: electricity technologies at large, grid measurement, protection and control, Distributed Energy Resource (DER) management, industrial automation and power electronics, Information and Communication Technologies (ICTs) at large, building and home automation, and smart metering. This diversity of domains gives rise to the overlap of many standards and different standardization approaches.
There is extensive research work in the IOP literature focusing on different SG domains with respect to their relevant standards. In this regard, and in response to the European M/490 mandate, the “CEN-CENELEC Smart Grid Set of Standards” document [3] provides a comprehensive list of standards for supporting and fostering the deployment of SG systems in Europe and facilitating interoperable solutions. In particular, it provides any SG stakeholder with a selection guide to identify the most appropriate (existing and upcoming) standards to consider, depending on the specific SG system and the Smart Grid Architecture Model (SGAM) layer of interest.
Paper [4] explores the modelling structure of the IEC 61850 standard for microgrid protection systems, highlights the IOP issues that might arise from the ambiguity and flexibility in IEC 61850, and proposes a framework for microgrid protection which can include IOP testing to check interactions between different devices. However, it does not elaborate on how the IOP testing should be performed. Wind power energy communications and the standard designed in this regard (IEC 61400-25) are the focus of [5]. In this reference, a procedure is developed to implement a gateway which translates legacy or proprietary protocols into the IEC 61400-25 MMS-based protocol. The scope of [5] is the implementation of the above-mentioned gateway, providing a uniform communication platform to monitor and control wind power plants. However, it does not discuss which IOP issues this platform can solve and mainly focuses on the integration of wind turbines from different manufacturers. Paper [6] describes sensor interface standards used in SGs and the need for IOP testing in this regard. An IOP test system for eight commercial Phasor Measurement Unit (PMU)-based Smart Sensors (SSs) is developed and used to verify IOP. The focus of [6] is on the communication process between the SS client (PMU Connection Tester) and the SS server (PMU under test), checking whether this process is collaborative and the messages are compatible with the SS communication protocol.
Papers [5,6] are two examples of common practices to perform IOP testing in the SG domain. However, such common practices do not follow a methodological approach proposed specifically for this domain. Furthermore, there is literature following methodological approaches which were not originally proposed for the energy domain but could be used for SG purposes. As an example, paper [7] adapts the i-Score methodology, initially developed in the US military context, to the SG domain. The IOP Score (i-Score) model from [8] is used in [7] with a focus on improving the normalized i-Score. The IOP rating in [7] focuses on the data exchange and the five IOP levels [9] from level 0 (isolated systems) to level 4 (common conceptual model and semantic consistency). However, the IOP Score does not provide a clear picture of the technical aspects of the interactions between different systems pursuing a specific objective in the SG domain. In fact, the focus of [7] is on data interchange and interface documentation, while, for the IOP ranking, the interfaces are not scored based on their functionalities.
In a nutshell, whereas IOP has been recognized as a crucial component for fostering grid modernization, IOP testing is still far from being commonly specified. The common practice of developing ad hoc IOP testing procedures without the adoption of well-structured methodological approaches might produce tests affected by side effects such as lack of reproducibility, poor quality, longer development time and higher cost. In this context, the Smart Grid Coordination Group (SG-CG) of CEN-CENELEC has delivered the “Methodologies to facilitate smart grid system interoperability through standardization, system design and testing” report [2]. In [2], a methodology is illustrated for achieving IOP of SG projects through: use case (UC) creation and system design; definition of IOP profiles based on UCs, standards and specifications; and compliance, conformance and IOP testing. This methodology is supported by the provision of a SGAM-based “IOP Tool”, an Excel tool that helps the user find the required standards for specification, profiling and testing from an IOP perspective. It can be noted, however, that only a very small subset of the more than 500 standards reported therein has actually been tested from an IOP perspective.
To fill this gap, and as an extension and further refinement of [2], a systematic approach for developing SG IOP tests has been proposed by the Smart Grid Interoperability Laboratory (SGILab) at the European Commission Joint Research Centre (EC-JRC). In [10], the JRC-SGILab has proposed a structured framework for IOP testing which consists of a step-wise procedure for the experimenter/developer to smoothly design an IOP test. The activities of the suggested IOP testing process need to be explicitly followed and involve: (1) UC elaboration; (2) specification of Basic Application Profiles (BAPs); (3) elaboration of the Basic Application Interoperability Profiles (BAIOPs); (4) statistical Design of the Experiments (DoE); (5) laboratory testing; (6) statistical Analysis of the Experiments (AoE). The tutorial paper [11] provides a condensed insight into the methodology and shows how it can be used successfully, focusing on the first three steps of the methodology (i.e., UC definition, BAP and BAIOP elaboration) for a demand side management UC. In [12], a small-scale advanced metering infrastructure UC exemplifies the applicability of the methodology reported in [10], focusing in particular on the DoE procedure.
The objective of the research work described in the present paper is the specific application of the JRC-SGILab methodology reported in [10] to a Distribution Management System (DMS) UC (according to the categorization proposed in [3]) with the scope of performing IOP testing. In contrast to the UCs analyzed in [11,12], the application of the methodology to a DMS UC is performed here for the first time, to the best of the authors’ knowledge. The selected UC studies an exemplary flexibility activation chain which provides the Distribution System Operator (DSO) with a voltage regulation service. For accomplishing this voltage support service, the flexibility activation chain is characterized by the following three mutually interacting actors: the Supervisory Control And Data Acquisition system of the DSO (DSO SCADA), a field gateway in the form of a Remote Terminal Unit (RTU) and a flexibility source in terms of a flexible load (FLEX). Using terminology widely known within the testing community, these three actors can be considered to constitute the Equipment Under Test (EUT), since the functionality of the flexibility activation chain as a whole is investigated from an IOP perspective. However, in this research work, no physical hardware implementation is carried out; instead, models of these three actors are employed. A specific test bed is adopted, which includes a testing environment comprising a real-time power grid simulator and a communication network emulator. The test bed used in this research work is based on a hardware-in-the-loop (HiL) setup.
This research work stems from the SGAM-based interoperability/interchangeability study carried out in the European Union Horizon 2020 InterFlex project [13], where six different European demonstration sites were realized with a focus on flexibility services, spanning from energy generation to demand. Two findings of [13] are used in this research work to build the architecture and the functionality of the selected UC, as follows.
  • In [13], two main architectures are identified (out of the different European demonstrators implemented therein) which are at the DSO’s disposal for activating the flexibility source, namely via a direct (“lower-bound”) or an indirect (“upper-bound”) interface. More specifically, the DSO can either employ its own field gateway (e.g., an RTU) to access the sources of flexibility (lower-bound architecture) or, alternatively, the DSO SCADA system can first be interfaced with an intermediate actor (an aggregator or an energy management system) and then access the flexibility via a field gateway (upper-bound architecture). The reader is referred to [14] for more details. The first alternative (lower-bound architecture) is employed for building the architecture of the UC selected for this research work, while the second one is not addressed.
  • In [13], a set of “super-categories” is identified as a set of possible DSO services supported by the activation of a flexibility chain across the different European demonstrators implemented therein. More details in this regard are provided in [15]. Out of all the super-categories identified (e.g., congestion management, frequency support, voltage support, etc.), the “voltage support” service is used in this research work for defining the functionality of the selected UC.
In addition, in the present paper, connections with [13] are made where needed to motivate some specific assumptions and applications. For example, some of the telecommunication technologies considered during the BAP creation or the intervals of variation of some input parameters during the DoE step do reflect some demo-specific implementations of [13].
In the adopted UC, the lower-bound architecture (employed by the DSO to activate the flexibility source) for providing a voltage support service is analyzed from an IOP point of view. In particular, the performance of the system represented by the chain DSO SCADA ↔ RTU and RTU ↔ FLEX is investigated under different operational conditions (considering, during the DoE phase, heterogeneous parameters, i.e., parameters related to the voltage support service as well as to the communication infrastructure), evaluating their effect on the IOP testing verdict. Specifically, the outcome of IOP testing is defined in terms of a “pass” or “fail” verdict, considering the final value of the restored voltage at a specific power grid node after the flexibility activation. The results lead to a qualitative ranking of the input parameters in terms of their relative importance in affecting the system performance.
This approach to IOP testing can be mapped to the so-called “active interoperability testing”, which is the alternative to “passive interoperability testing”. In [16], the difference between these two approaches is extensively addressed. Passive IOP testing can be seen as a simple “monitoring” of the EUT to detect possible IOP issues (faults) while the EUT works under normal conditions. On the other hand, active IOP testing aims to detect IOP faults in a given EUT by applying a series of stimuli, hence disturbing the normal EUT operation. Examples of passive IOP tests are more easily found in the literature, being more straightforward and less demanding in terms of testing system deployment. For example, reference [17] proposes a passive IOP test method for IEC 61850-9-2-based multi-vendor merging units, which exchange information with a protection relay. The IOP testing result is in the form of a “pass” or “fail”. Similarly, paper [18] performs passive IOP testing between multi-vendor digital substation devices, and the test results are again in the form of IOP success or failure. These are some examples of common practices of passive IOP testing. However, in the research work described in the present paper, the active IOP testing approach is chosen, since the interest is in investigating the system IOP under different operational conditions.
It is noteworthy that the methodological approach followed and applied in this research work for IOP testing purposes is not restricted to the specific UC. In particular, the same procedure can be applied not only to the UCs implemented in all the demonstrators of [13] (which provide a comprehensive and applicative environment for IOP testing in the SG context), but also to any other set of demonstrators/UCs in the SG domain. Of course, the specific instantiation of the IOP testing methodology has to take into account the peculiar properties of the UC under analysis (e.g., architecture, functionality, etc.) so as to design a meaningful and robust IOP test specifically tailored to the SG application under study. In fact, not only will the definition of BAPs, BAIOPs and DoE be different (depending on the specifics of the developed UC), but proper modifications and/or extensions of the adopted test bed as well as of the selected EUT also have to be made when considering different UCs and other (physical or modelled) equipment. Nonetheless, the considerations made in this research work may be useful and reusable, in the sense that general directions to take into account when applying this IOP testing approach are provided and highlighted to support specialized engineers and/or SG stakeholders when dealing with the IOP testing challenge.
The main contributions of this research work are as follows.
  • After being applied to the examined UC, the JRC-SGILab methodology has proven to be a flexible and valuable tool to successfully accomplish the breakdown of the IOP testing into a structured framework. The broad range of applicability of this methodology is promising, and a variety of different SG applications can undoubtedly benefit from it when tackling the challenge of IOP testing.
  • Besides a thorough definition of BAPs and BAIOPs, the experimental design has been found to be a crucial activity to thoroughly characterize complex systems and investigate their performance from an IOP perspective. The systematic laying out of a detailed experimental plan should be an essential step before performing IOP testing of any SG application.
  • The collected results show that, without the integration (within the IOP testing workflow) of fine-grained statistical tools, it is challenging to investigate in depth specific features of the system IOP behavior (for example, the quantitative ranking of the system parameters or the identification of mutual interactions which may be important in driving the IOP verdict).
  • The concepts of inter- and intra-BAIOP testing are proposed in this work to reflect different (but possibly complementary) interests of SG stakeholders, i.e., the evaluation of the system IOP across multiple BAIOPs and/or within one specific BAIOP.
  • From this research work, SG stakeholders can gain useful insight into possible pitfalls to take into account when dealing with the IOP testing challenge in large-scale and complex power systems as well as in future SG applications similar to the one addressed with the examined UC.
The rest of the paper is organized as follows. Section 2 describes the adopted JRC-SGILab methodology, its application to the examined UC for the purpose of IOP testing, and the concepts of inter- and intra-BAIOP. Section 3 presents and comments on the experimental results. Section 4 provides the discussion derived from this research work. Section 5 draws the conclusions.

2. Material and Methods

This section first presents the JRC-SGILab methodology followed in this research work (Section 2.1) and the test bed adopted (Section 2.2). In Section 2.3, the application of the JRC-SGILab methodology is thoroughly described, from the UC definition to the DoE and test design, and the concepts of inter- and intra-BAIOP testing are proposed.

2.1. The JRC-SGILab Adopted Methodology

For investigating the performance of the system represented by the chain DSO SCADA ↔ RTU and RTU ↔ FLEX under an IOP perspective, this research work adopts the basic methodological directions proposed in [10] by the JRC-SGILab. This “Smart Grid interoperability testing methodology” report presents best practices for UC developers, experimenters and analysts to perform an integrated IOP testing for SG applications. The step-by-step procedure suggested by the JRC-SGILab is schematically depicted in Figure 1, where the six activities to be performed for the IOP testing are shown in the dark blue boxes.
The documentation of the methodology proposed by the JRC-SGILab also includes some templates which have to be filled out during the testing process. Excerpts of these documents are reported in the rest of the paper as needed. In this subsection, the theoretical aspects of the adopted methodology are briefly described, while its specific application to the selected UC is detailed in Section 2.3.
The adopted IOP testing methodology starts with the creation of the UC, in which the description of the possible sequences of interactions between the actors/components of the system under study is related to a specific scope/functionality. The UC description helps in formalizing all the functional and non-functional requirements related to the behavior of the system and serves as the basis for designing the experimental tests as well as the configuration of the test bed.
The profiling phase is composed of BAP and BAIOP creation. During the BAP definition, it has to be determined which standards (or set of standards) are considered, which options from these standards are selected and how they are used to reach the desired UC functionality. In particular, for defining the BAPs, the standards (and their options) specifying the information flows between all the UC actors/components are considered, taking into account also possible alternative ways for the actors to interact. Since each standard could specify different options, each interaction (interface) between actors may be characterized by more than one BAP. After the BAPs definition for all the interfaces is laid out, the BAIOPs need to be specified for describing how the IOP tests (in the SGAM layers of interest) have to be performed under non-stress conditions. In particular, each BAIOP includes a unique combination of BAPs for all the interfaces involved in the UC. The whole set of BAIOPs is meant to be the basis for defining the test case(s) that will be run in the experimental phase to prove that the functions described in the UC are correctly supported under an IOP perspective.
The DoE is a systematic approach for laying out in advance the experimental plan, taking into account the objective of achieving an efficient production of experimental data within a “resource-saving” context. In particular, the DoE determines how to alter the system parameters to assess whether system IOP is met or not under different operational conditions (possibly including also stress conditions). The main steps of an effective DoE procedure are as follows.
  • Define the scope of the experiments.
  • Identify the system response(s)—or output(s)—which must be measured.
  • Identify the process variables—or input factors—which may affect the system output(s).
  • Statistically characterize each process variable (i.e., by defining their ranges of variation).
  • Sample N values within the ranges of variation of each process variable.
The DoE procedure feeds the testing phase, in which the experimental points are tested in the laboratory environment or, alternatively, used to feed the analytical model if the experimenter has access to the governing equations.
Once the experimental results are collected, statistical AoE is performed on the acquired data to extract interesting features and meaningfully interpret the results from an IOP perspective. In particular, the integration of the statistical DoE and AoE within the methodology allows for the effective investigation of which factors are responsible for jeopardizing the system IOP (in the case that the system functionality can be influenced by different parameters).

2.2. Adopted Test Bed

For assessing the selected UC from an IOP point of view, the test bed represented in Figure 2, based on an HiL setup, is adopted. It consists of the models of the electrical and communication grids as well as the actors involved in the examined UC. This setup represents an environment in which different actors and components of the electrical and communication grids interact to provide the voltage support service of the DSO.
More specifically, Controller HiL (CHiL) is selected as the HiL approach, with the MATLAB Simulink® modelling environment used to model the power grid [19]. The CIGRE European Low Voltage (LV) distribution network benchmark model [20] depicted in Figure 3 is modelled and interfaced with the OPAL-RT real-time simulator via RT-LAB®. For emulating the communication network, the network emulator NRL CORE [21] running on a laptop with USB-to-RJ45 connectors has been used. The three actors participating in the flexibility activation chain (i.e., DSO SCADA, RTU, and FLEX) are modelled and run on three Raspberry Pi single-board computers. For a more detailed presentation of the test bed, the reader is referred to [15,22].
It is noteworthy that the first version of the test bed adopted in this work can be modified for performing similar IOP tests of UCs examining other functionalities (such as the “congestion management” DSO service identified in [15]). Furthermore, as depicted in the right part of Figure 2, the test bed can be extended via a Virtual Private Network (VPN) connection such that the inclusion of external/internal real hardware representing flexibility devices is possible, in order to conduct Power-HiL (PHiL) tests [23].

2.3. Application of the JRC-SGILab Methodology in the Examined UC

In this subsection, details of how the JRC-SGILab methodology is applied in this research work are given. Information is provided regarding the UC elaboration (Section 2.3.1), the BAPs and BAIOPs specification (Section 2.3.2 and Section 2.3.3) as well as the definition of DoE (Section 2.3.4) and IOP testing (Section 2.3.5).

2.3.1. Description of Use Case

The UC examined in this research work studies the system representing the flexibility activation chain DSO SCADA ↔ RTU and RTU ↔ FLEX, where DSO SCADA, RTU and FLEX mutually interact to provide the DSO with a voltage support service as described hereafter. This UC is analyzed for IOP purposes by studying how different operational conditions (including parameters related to the three actors as well as to the communication infrastructure) are able to affect the system performance and therefore the system IOP.
The actors participating in the flexibility activation chain are defined (according to [24]) in Table 1, where some additional UC specific information is also reported.
To properly feed the profiling phase (Section 2.3.2 and Section 2.3.3), a thorough analysis of the interactions between these actors is needed. The message sequence chart illustrated in Figure 4 shows the interactions between the three UC actors.
The specification of the information flows and the UC mechanism is described in Table 2, where the UC step-by-step analysis is shown. In the examined UC, the DSO SCADA constantly monitors the voltage level of the power grid (steps 1 and 2). As soon as the voltage falls outside a predefined threshold, the DSO SCADA detects it (step 3). Consequently, the DSO requests support from the FLEX located at the customer side by sending a flexibility activation signal via an RTU (step 4) towards the FLEX (step 5). The FLEX offers its voltage support service in an attempt to make the system stable again: the FLEX dispatches its entire amount of flexibility (in terms of power injection) into the power grid node where it is located (step 6). After its activation, the FLEX acknowledges back to the RTU (step 7) and then to the DSO SCADA (step 8). By continuing to monitor the power grid and reading the node voltages from the grid, the DSO SCADA reports back the final voltage value once the whole available flexibility amount is injected into the system. It is noteworthy that the monitoring behavior of the DSO SCADA (steps 1 and 2) is conducted continuously, irrespective of the activation of the FLEX. The performance of the whole system in restoring the node voltage within the predefined threshold is analyzed from an IOP point of view, as detailed in Section 2.3.5.
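To make the interaction sequence of Table 2 more concrete, the following minimal Python sketch mirrors steps 1–8 of the UC. All class names and numerical values are hypothetical illustrations introduced here; the actual actors are separate models running on dedicated hardware and exchanging messages over the emulated network, which this sketch deliberately abstracts away.

```python
# Illustrative sketch of the flexibility activation chain (steps 1-8 of Table 2).
# Class names and values are hypothetical; messaging, timing and the grid model
# of the real test bed are abstracted away.

class Grid:
    """Trivial stand-in for the node under study in the simulated LV network."""
    def __init__(self, voltage_pu):
        self.voltage_pu = voltage_pu
    def read_voltage(self):
        return self.voltage_pu
    def inject_power(self, delta_v_pu):
        self.voltage_pu += delta_v_pu           # crude voltage rise due to injected flexibility

class Flex:
    def __init__(self, flex_cap_pu):
        self.flex_cap_pu = flex_cap_pu          # FlexCap (expressed here as a voltage effect)
    def activate(self, grid):
        grid.inject_power(self.flex_cap_pu)     # step 6: inject the whole flexibility amount
        return "ACK"                            # step 7: acknowledge to the RTU

class Rtu:
    def __init__(self, flex):
        self.flex = flex
    def forward_activation(self, grid):
        return self.flex.activate(grid)         # step 5: forward command; step 8: ack to SCADA

class DsoScada:
    def __init__(self, rtu, avd_percent, v_ref=1.0):
        self.rtu, self.avd, self.v_ref = rtu, avd_percent, v_ref
    def monitoring_cycle(self, grid):
        v = grid.read_voltage()                                  # steps 1-2: continuous monitoring
        if 100 * abs(v - self.v_ref) / self.v_ref > self.avd:    # step 3: deviation detected
            self.rtu.forward_activation(grid)                    # step 4: activation via the RTU
        return grid.read_voltage()                               # restored node voltage reported

grid = Grid(voltage_pu=0.92)                                     # disturbed voltage (hypothetical)
scada = DsoScada(Rtu(Flex(flex_cap_pu=0.05)), avd_percent=5.0)
print(round(scada.monitoring_cycle(grid), 3))                    # restored voltage after activation
```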
In this UC, the following assumptions are made:
  • Whenever a flexibility activation command for the voltage support service is sent, there is already a certain amount of flexibility which could be activated to support this service.
  • After the voltage deviation is detected by the DSO SCADA, the signal sent by the DSO SCADA to the FLEX is in the form of a flexibility activation command, i.e., no activation request schema is modelled and no negative answers to this flexibility activation command are considered.
  • The voltage deviation detected by the DSO SCADA, which triggers the sending of a flexibility activation command to the FLEX, is due to a fixed disturbance which occurs at a specific node of the power grid and leads to a voltage drop. As soon as this situation is detected by the DSO SCADA, the RTU sends the activation signal to the FLEX which is always ready to deliver all its flexibility (the certain amount mentioned above which could be in an aggregated form and is ready to be activated) to the same node.
  • The voltage disturbance does not lead (in any node of the adopted power grid model) to any voltage deviation beyond the regulation limits set in [25].
  • The distribution system is operating based on the benchmark network data (reported in [20]) for lines, transformers and loads.
It should be said that the assumptions made above narrow down the study to a realistic scenario without implying any negative impact on the relevance of the IOP testing methodology. However, one can assume other scenarios (UCs) such as the consideration of different disturbances (different voltage deviations), multiple sources of flexibility at different locations in the grid, etc. Furthermore, it might be of interest to evaluate IOP for different network conditions for certain studies. For instance, the base voltages of the per-unit system, line lengths, line types, line parameters and loads can be modified as long as the typical distribution system character is maintained. In such a case, the test bed described in Section 2.2 can be modified accordingly for such studies. Furthermore, in Section 4 some more suggestions are provided for the reusability of the test bed in future works.
The selected UC can be mapped to the SGAM layers of interest in this application (namely component, function and communication) as depicted in Figure 5.
The component layer comprises the three actors of the flexibility chain interacting for providing the voltage support service, organized in the lower-bound architecture where the DSO SCADA is directly interfaced with the flexibility device through the RTU.
The function layer represents the functionality achieved by the selected UC, namely the voltage support service which needs to be provided by the flexibility activation chain in the lower-bound architecture.
The communication layer is built based on all the communication protocols used in the selected UC in order to retrieve the necessary information and control of the distribution network. The specific communication protocols regulating the data exchange at the interfaces DSO SCADA ↔ RTU and RTU ↔ FLEX in Figure 5 are specified in Section 2.3.2. A detailed, albeit not exhaustive, list of the communication standards for these interfaces can be found in [3]. In this reference, the network types for the selected interfaces are also defined and indicated in Figure 5 with the letters in the green disks (“L”, “E”).
It should be highlighted that in this work, information and business layers are not the focus of the above-mentioned SGAM mapping. However, some suggestions about the potential inclusion of business-related considerations within this UC are made in Section 4.

2.3.2. Basic Application Profiles (BAPs) Definition

According to [26], BAPs are built based on the interfaces between all the different actors involved in the UC. As depicted in Figure 5, the interaction links of interest are the interfaces DSO SCADA ↔ RTU and RTU ↔ FLEX. The list of the considered BAPs related to the interfaces between the three UC actors is shown in Table 3. The communication parameters specifying each of the BAP-forming communication technologies are reported in Table 4. It is noteworthy that in the communication SGAM layer, out of all the possible communication standards regulating the actors’ interfaces of this UC, the BAPs are defined in this research work by looking at the different state-of-the-art communication technologies implemented in the demonstrators of [13] as well as potential candidates for future implementations.

2.3.3. Basic Application Interoperability Profiles (BAIOPs) Definition

After the BAPs definition for all the interfaces is laid out, the BAIOPs need to be specified [3,26] to determine how the IOP tests have to be performed under non-stress conditions. In this research work, out of all the possible BAPs combinations in Table 3, only a subset has been considered and tested. In particular, the three BAIOPs selected for the SGAM communication layer are reported in Table 5, each one of them characterized by a unique combination of the telecommunication technologies reported in Table 4. As an example, BAIOP 1 results from the combination of BAP 1a (“Ethernet xDSL/cable” technology for the interface DSO ↔ RTU) and BAP 2a (“Fiber/local Ethernet” technology for the interface RTU ↔ FLEX). These two BAPs are characterized by the specific configuration of communication parameters (set up in the network emulator) which can be read in the third and second rows of Table 4, respectively.
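As an illustration of the profiling step, the sketch below encodes the BAPs of Table 3 as per-interface technology options and derives candidate BAIOPs as unique combinations of one BAP per interface (cf. Table 5). The numerical communication parameters are placeholders and do not reproduce the actual values of Table 4; the entries marked as hypothetical stand in for the remaining options of Table 3.

```python
# Sketch of BAP/BAIOP bookkeeping; parameter values are placeholders, not Table 4 data.
from itertools import product

# BAPs per interface (cf. Table 3); "hypothetical ..." entries stand in for further options
baps = {
    "DSO_SCADA<->RTU": ["Ethernet xDSL/cable", "hypothetical technology X"],
    "RTU<->FLEX":      ["Fiber/local Ethernet", "hypothetical technology Y"],
}

# Placeholder communication parameters per technology (configured in the network emulator)
comm_params = {
    "Ethernet xDSL/cable":  {"bandwidth_mbps": 20.0, "background_traffic_mbps": 1.0,
                             "delay_us": 5000, "jitter_us": 500,
                             "packet_loss_pct": 0.1, "duplicate_pct": 0.0},
    "Fiber/local Ethernet": {"bandwidth_mbps": 100.0, "background_traffic_mbps": 1.0,
                             "delay_us": 200, "jitter_us": 20,
                             "packet_loss_pct": 0.0, "duplicate_pct": 0.0},
}

# Every unique combination of one BAP per interface is a candidate BAIOP;
# only a subset (the three BAIOPs of Table 5) is actually selected and tested.
candidate_baiops = list(product(baps["DSO_SCADA<->RTU"], baps["RTU<->FLEX"]))
print(candidate_baiops[0])   # e.g., BAIOP 1 = ('Ethernet xDSL/cable', 'Fiber/local Ethernet')
```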

2.3.4. Design of Experiments

The overall system performance in terms of the UC functionality (i.e., providing the voltage support service) is assessed by deliberately changing the parameters which are considered, a priori, as potentially able to affect the system output and therefore the IOP verdict. For this purpose, the DoE procedure is employed and applied in this research work as detailed hereafter.
  • Definition of the experimental goals. In this UC, a DSO SCADA is interfaced with the FLEX via an RTU to provide the voltage support service. The objective of the UC is to investigate the IOP between these three actors under different telecommunication architectures (inter-BAIOP testing) and, for a given telecommunication architecture, under different service-related conditions (intra-BAIOP testing). Details of these two types of tests are given in Section 2.3.5.
  • Definition of the system output(s). Two system responses are identified as system outputs relevant for assessing the IOP of the flexibility activation chain.
    • Restored voltage (V_res), i.e., the value of the voltage measured at a specific node after the flexibility (located at the same node) is activated in the attempt of restoring the voltage within the allowed DSO-specific voltage range. In other words, after the flexibility activation is accomplished (step 8 of Table 2), the voltage monitoring (steps 1 and 2) delivers V_res.
    • Restoration time (t_res), i.e., the time the system takes to restore the voltage at the specific node. In the case that the voltage restoration (within the DSO-specific admitted voltage range) is not successful, t_res is given an infinite value.
  • Definition of the input factors. Two categories of input factors (i.e., parameters which potentially influence the system response) are taken into account for the DoE of the laboratory testing.
    • Communication-related input factors, related to the chosen telecommunication architecture. These communication-related input factors are: Bandwidth (Mbps), Background traffic (Mbps), Delay (μs), Jitter (μs), Packet Loss (%), Duplicate (%).
    • Service-related input factors, related to the three actors involved in the flexibility activation chain (DSO SCADA, RTU, FLEX) for providing the voltage support service. In particular, the following four service-related parameters are defined.
      RTU Processing Time (RTUProcT), which refers to the internal RTU time delay (i.e., time before RTU sends a flexibility activation signal to FLEX).
      Admitted Voltage Deviation (AVD), which is the “quality of service” the DSO wants to provide in terms of maximum allowable voltage deviation.
      Flexibility Response Time (FlexRespT), which is the time required for the flexibility to activate (i.e., time that FLEX takes before injecting its flexibility amount into the specific power grid node).
      Flexibility Capacity (FlexCap), which is the total amount of flexibility available in FLEX.
  • Identification of the ranges of variation of the input factors. As suggested by the procedure, information regarding the limits of variation of the selected input factors may come from different sources (such as literature, expert knowledge, preliminary experiments, standards etc.).
    • For the service-related input factors, the references used are [25] for AVD, [27] for RTUProcT and [28,29] for the FLEX-related parameters. More specifically, these four input factors are considered to be stochastic variables (with a specific Probability Density Function, PDF), whose ranges of variation are taken into account and reported in Table 6.
    • For the communication-related input factors, information derived from state-of-the-art telecommunication technologies (in particular, those implemented in the demonstrators of [13]) as well as potential future implementations is used. More specifically, the choice is made, for each telecommunication technology of Table 3, to set the values of the communication-related input factors as reported in Table 4.
  • Sampling within the input factors’ ranges. Ranges are defined only for the service-related input factors, for which N values are randomly sampled from the intervals specified in the previous step and reported in Table 6. These N values for each service-related input factor are used within the intra-BAIOP testing (see Section 2.3.5).
A summary of the implementation of the DoE procedure is presented in a condensed way in Table 7, which is an excerpt of the template document provided in the JRC-SGILab report specifically filled out for the examined UC.
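The sampling step of the DoE can be summarized by the short sketch below, which draws N random values per service-related input factor. The ranges (and the uniform sampling) used here are hypothetical placeholders for illustration only; the actual intervals and probability density functions are those reported in Table 6.

```python
# Sketch of the DoE sampling step; ranges and the uniform PDF are illustrative
# placeholders, not the actual specifications of Table 6.
import random

random.seed(42)
N = 20  # number of samples per service-related input factor (hypothetical)

service_factor_ranges = {
    "AVD_pct":     (1.0, 10.0),   # Admitted Voltage Deviation
    "FlexCap_kW":  (5.0, 50.0),   # Flexibility Capacity
    "FlexRespT_s": (0.1, 5.0),    # Flexibility Response Time
    "RTUProcT_s":  (0.01, 1.0),   # RTU Processing Time
}

# N sampled values per factor, used later for the intra-BAIOP test cases
samples = {name: [random.uniform(lo, hi) for _ in range(N)]
           for name, (lo, hi) in service_factor_ranges.items()}

# Mean values per factor, used as the non-stress setting in the inter-BAIOP tests
mean_values = {name: (lo + hi) / 2 for name, (lo, hi) in service_factor_ranges.items()}
```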

2.3.5. Interoperability Testing and the Proposed Concepts of Inter- and Intra-BAIOP

In the previous subsections, the first steps of the JRC-SGILab methodology have been followed, namely UC creation (Section 2.3.1), BAPs and BAIOPs definition (Section 2.3.2 and Section 2.3.3), DoE specification (Section 2.3.4). In this subsection, the IOP testing is defined and the concepts of inter- and intra-BAIOP IOP testing are proposed.

Definition of the Interoperability Testing

Testing is the following step to verify IOP of the system under study. It is noteworthy that “interoperability testing” is significantly different from “conformance testing” [2]. In fact, it can be the case that two implementations individually comply with a standard (i.e., they pass the conformance testing) but are still not able to operate together correctly in performing a predefined functionality (i.e., they are not interoperable). The JRC-SGILab methodology specifically focuses on the IOP testing.
As reported in [10], the result of the IOP testing should be either “pass” or “fail”, therefore the test should be planned and executed to lead to a clear verdict. The pass/fail verdict for the specific UC IOP testing is defined according to the following criteria:
if F < AVD then PASS
if F > AVD then FAIL
where (assuming a reference voltage V_REF):
F = 100 · |V_res − V_REF| / V_REF
Moreover, when collecting the results, not only the pass/fail verdict of the IOP test is recorded, but also the values actually measured for the two system responses (V_res and t_res).
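The criterion above translates directly into a small helper, sketched here in Python with hypothetical values; ties (F exactly equal to AVD) are treated as a failure purely for illustration, since the criterion does not specify this case.

```python
# Sketch of the pass/fail verdict computation; input values are hypothetical.
def iop_verdict(v_res: float, v_ref: float, avd: float):
    """Return the IOP verdict and the percentage deviation F of the restored voltage."""
    f = 100.0 * abs(v_res - v_ref) / v_ref
    return ("PASS" if f < avd else "FAIL"), f

verdict, deviation = iop_verdict(v_res=0.97, v_ref=1.0, avd=5.0)
print(verdict, round(deviation, 2))   # PASS 3.0
```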
In the following paragraphs, the concepts of inter- and intra-BAIOP are proposed, which refer to the two types of IOP testing specifically performed in this research work. However, these two concepts may also have a general value and be considered as two complementary variants of IOP testing to be performed while applying the JRC-SGILab methodology; therefore, the inter- and intra-BAIOP concepts can be seen as an enrichment/extension of the “testing” activity of the JRC-SGILab methodology (Figure 1).

Inter-BAIOP Interoperability Testing

When performing IOP testing, the analyst may be interested in evaluating the impact of the different BAIOPs on the system performance from an IOP point of view. In this direction, the concept of “inter-BAIOP” IOP testing is proposed, in order to qualitatively and/or quantitatively evaluate the influence of the BAIOPs under study on the system performance and IOP.
Translating this concept in terms of the specific UC under test (where only BAIOPs for the SGAM communication layer are considered), the inter-BAIOP testing is carried out to assess the influence of using different telecommunication architectures for achieving the functionality of the UC (i.e., providing the voltage support service in the examined flexibility activation chain). As described in Section 2.3.3, three different BAIOPs are chosen to be tested against IOP (reported in Table 5), each one of them characterized by a specific combination of telecommunication technologies supporting the interfaces between the three actors of the flexibility chain. Hence an analysis across the three selected BAIOPs is performed.
In more detail, the set of inter-BAIOP experiments performed within this research work is planned as follows. For each BAIOP (with the specific configuration of communication-related parameters as specified in Table 4), the service-related input factors are fixed at a predefined value, given the non-trivial experimental acquisition time. In particular, these service-related parameters are set at their mean values, taking into account the respective ranges of variation defined in Table 6. In this way, three test cases have been formalized (one for each BAIOP): by running the laboratory experiments with these specific input configurations, the system performance for each telecommunication architecture is assessed by measuring the system response in terms of V_res and t_res. The results of the inter-BAIOP testing are shown in Section 3.1.
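For clarity, the inter-BAIOP test plan just described can be written down as a simple list of three test cases, one per BAIOP, with all service-related factors frozen at their means. The mean values shown are hypothetical placeholders for the Table 6 intervals, and the test-case labels are introduced here only for illustration.

```python
# Sketch of the inter-BAIOP test plan: one test case per BAIOP, service-related
# factors frozen at their (hypothetical placeholder) mean values.
mean_service_params = {"AVD_pct": 5.5, "FlexCap_kW": 27.5,
                       "FlexRespT_s": 2.55, "RTUProcT_s": 0.505}

inter_baiop_plan = [
    {"test_case": f"TC-inter-{i}", "baiop": baiop, "service_params": mean_service_params}
    for i, baiop in enumerate(["BAIOP 1", "BAIOP 2", "BAIOP 3"], start=1)
]
# Each test case is executed on the HiL test bed and (V_res, t_res) is recorded,
# leading to the per-BAIOP verdicts shown in Figure 6.
```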

Intra-BAIOP Interoperability Testing

Besides the evaluation of the system performance across different BAIOPs (inter-BAIOP testing), the analyst is also interested in assessing the system performance within a given BAIOP from an IOP perspective. The concept of “intra-BAIOP” IOP testing refers exactly to this type of testing.
In terms of the examined UC, the purpose of the intra-BAIOP tests is to analyze how the variation of the different service-related parameters related to the actors involved in the flexibility chain impacts the system response (V_res and t_res) within a specific telecommunication architecture (BAIOP) and therefore the IOP verdict. For this purpose, the DoE determines the strategy for sampling the space spanned by the service-related inputs so as to properly perturb the system and verify whether the system IOP holds under different conditions.
In theoretical terms, an intra-BAIOP testing may be carried out for each of the (three) considered BAIOPs. However, it may be of interest for the experimenter to focus only on one specific BAIOP: in this case, the criterion for selecting the "most interesting" BAIOP may be based on the evaluation of the inter-BAIOP testing results. Specifically, in this research work the telecommunication infrastructure which delivers the “best” system response (in terms of the “optimal” combination of V_res and t_res from the DSO’s perspective), as determined by the inter-BAIOP testing, is the one further investigated from an intra-BAIOP perspective. The results of the intra-BAIOP testing are presented in Section 3.2. Table 8 provides an excerpt of a JRC-SGILab Test Description document, which has been specifically filled out for defining one intra-BAIOP test case.

3. Results

In this section, the results for both inter-BAIOP and intra-BAIOP tests are illustrated and the respective conclusions are drawn. It is noteworthy that these conclusions are bound to the modelling environment (both grid and component models) and the assumptions used in this UC in order to set the boundaries for the IOP testing.
The following convention is adopted for the plots presented in Figure 6, Figure 7, Figure 8, Figure 9 and Figure 10. The values of V_res (readable on the right axis) are indicated with black diamonds, while the values of t_res (readable on the left axis) are indicated with vertical bars. The information regarding the IOP verdict is carried by the color of the vertical bars: red and green bars indicate the outcomes “fail” and “pass”, respectively.

3.1. Inter-BAIOP Interoperability Testing

In this subsection, the results of the experiments with respect to the inter-BAIOP IOP testing are provided. The scope of this inter-BAIOP test is to assess the influence of selecting different BAIOPs on the system performance from an IOP perspective.
The three experiments for the inter-BAIOP tests lead to the results reported in Figure 6, where the different possible telecommunication architectures (BAIOPs) considered in Table 5 are investigated. The performance of the system represented by the flexibility activation chain is examined in terms of the V_res and t_res values, and the IOP verdict is recorded. From Figure 6 the following conclusions can be derived.
The three different BAIOPs can equally compensate the voltage deviation and restore the voltage to V_res = 0.96 per unit. Evaluating the system response in terms of t_res, the restoration time shows a stronger dependency on the specific combination of telecommunication technologies.
The IOP verdict is “pass” in each of the three tested configurations: when the system is placed under non-stress conditions (mean values of the service-related parameters within their intervals of variation), all three BAIOPs successfully pass the IOP test. Therefore, in terms of the IOP verdict, in this situation the system IOP is not affected by the selection of different BAIOPs.
The results deriving from this inter-BAIOP testing can be used, for example, by the DSO in the following manner. Given that all three BAIOPs successfully pass the IOP test, the attention of the DSO, in order to “rate” the quality of the voltage support service under different telecommunication architectures, may be directed towards the one which is capable of delivering the optimal combination of t_res and V_res. In this specific case, the BAIOP able to deliver the best t_res (i.e., the shortest restoration time) would be chosen.
It has to be highlighted that other strategies could be used for this inter-BAIOP testing: for example, instead of using mean values for the service-related parameters in each BAIOP, more experimental points might have been sampled, or another inter-BAIOP “indicator” (for evaluating the effect of selecting different BAIOPs) could have been chosen instead of the same system responses used for the intra-BAIOP testing (i.e., V_res and t_res). The choice of other indicators for evaluating the system performance across BAIOPs is left as a future work direction.

3.2. Intra-BAIOP Interoperability Testing

In this subsection, the results of the experiments for the intra-BAIOP tests are shown. The aim of these intra-BAIOP tests is to assess how the system performance (and therefore the IOP verdict) is affected, within a specific BAIOP, when changing the configuration of the service-related parameters (potentially considering also stress conditions).
As described in Section 2.3.5, intra-BAIOP testing is carried out within a specific telecommunication architecture (i.e., BAIOP, with values of the communication parameters as specified in Table 4). In particular, the BAIOP which has provided the best results in the inter-BAIOP testing (Figure 6) is worth investigating (in other words, the information derived from the inter-BAIOP testing can guide the intra-BAIOP testing). In this particular case, it is assumed that the DSO (in order to be able to deliver the “best” quality of service according to its own preference) chooses BAIOP 1, where an optimal combination of restored voltage and restoration time is achieved. For this selected telecommunication architecture, the intra-BAIOP testing provides insight into the performance of the flexibility activation chain from an IOP point of view, by assessing the system behavior for different configurations of AVD, FlexCap, FlexRespT and RTUProcT. In particular, for each BAIOP the N test cases are produced by letting each service-related input factor, one after another, vary freely within its interval of variation while the others are set at their mean values, as sketched below.
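A minimal sketch of this one-factor-at-a-time test-case generation is given here; the ranges, sample count and uniform sampling are hypothetical placeholders, while the structure (one factor sweeping its sampled values, the others held at their mean) reflects the scheme described above.

```python
# Sketch of intra-BAIOP test-case generation (one factor at a time, others at mean).
# Ranges and N are hypothetical placeholders for the Table 6 specifications.
import random

random.seed(0)
N = 20
ranges = {"AVD_pct": (1.0, 10.0), "FlexCap_kW": (5.0, 50.0),
          "FlexRespT_s": (0.1, 5.0), "RTUProcT_s": (0.01, 1.0)}
means = {name: (lo + hi) / 2 for name, (lo, hi) in ranges.items()}

intra_baiop_cases = []
for factor, (lo, hi) in ranges.items():
    for value in sorted(random.uniform(lo, hi) for _ in range(N)):
        inputs = dict(means)      # all service-related factors at their mean values...
        inputs[factor] = value    # ...except the one currently under study
        intra_baiop_cases.append({"baiop": "BAIOP 1",
                                  "varied_factor": factor,
                                  "inputs": inputs})

print(len(intra_baiop_cases))     # 4 factors x N samples = 80 test cases
```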

3.2.1. DSO-Related Intra-BAIOP Testing

To analyze the impact of the AVD that the DSO considers for the voltage support service, an intra-BAIOP testing is carried out by setting RTUProcT, FlexRespT and FlexCap to their mean values while AVD varies in the tolerance range that the DSO wants to provide for its voltage support service. In particular, the interval of variation defined for AVD (see Table 6) is less than the voltage deviation at the node under study caused by the disturbance. A random sampling is performed within the range of variation of AVD.
The experimental results after running this set of test cases can be observed in Figure 7.
The following conclusions can be drawn from Figure 7. The quality of service (AVD) that the DSO wants to provide after a disturbance can affect the restoration time (t_res). On the other hand, the amount of restored voltage (V_res) is not affected at all, since it depends on the (fixed) amount of flexibility (FlexCap) available at the specific node. Moreover, it can be noticed that, when the flexibility and RTU-related parameters are fixed at their mean values, there is a minimum value of AVD required for the system to restore the voltage; below this threshold, the IOP test fails (red bars at low values of AVD). In other words, the DSO cannot expect to increase the quality of service (i.e., decrease AVD) while the amount of available flexibility at the customer side (FlexCap) remains fixed.

3.2.2. Flexibility-Related Intra-BAIOP Testing

The impact of the FLEX-related input factors (FlexRespT and FlexCap) on the system response is then investigated. For this purpose, the values of AVD and RTUProcT are set to the mean values of their variation ranges (see Table 6).
First, to analyze the effect of FlexCap on the system performance, FlexRespT is also set at its mean value, letting FlexCap vary freely within its range of variation.
As shown in Figure 8, FlexCap has an influence on both V_res and t_res, as well as on the IOP verdict. It can be observed that a minimum amount of flexibility is required to restore the voltage. This aspect needs to be taken into consideration by the DSO when making the contractual agreement with the flexibility owner: if FlexCap is below a certain threshold, the IOP test will fail.
Similarly, to analyze the effect of FlexRespT on the system response, FlexCap is now set at its mean value, with FlexRespT left free to vary. The result can be observed in Figure 9.
As observed, FlexRespT influences the system response only in terms of the restoration time, while the restored voltage value is not affected at all. No “fail” IOP verdicts are recorded in this specific input configuration.

3.2.3. RTU-Related Intra-BAIOP Testing

Finally, the parameters related to the DSO (AVD) and the FLEX (FlexRespT, FlexCap) are set to their mean values, while RTUProcT varies within its predefined range. The results can be observed in Figure 10.
As observed, RTUProcT influences the behavior of the system only in terms of the restoration time. Similar to Figure 9, there are no failure situations under these specific operational conditions.
In addition, by comparing the results of Figure 9 and Figure 10 (in terms of t_res), it can be concluded that FlexRespT has a higher impact on the restoration time than RTUProcT.
Collecting all the results shown in this subsection for the intra-BAIOP testing (bound to the assumptions made for modelling the UC), the following conclusions can additionally be drawn. For a given BAIOP, when the system represented by the flexibility activation chain is studied for different configurations of the service-related input factors (possibly also considering stress conditions), its performance as well as the IOP verdict are affected in different manners: some parameters are able to change the IOP verdict, while others appear unimportant in driving the variation of the system IOP outcome. In qualitative terms, it can be concluded that AVD and FlexCap are more important than FlexRespT and RTUProcT, since they are able (with their variation) to change the IOP verdict from “fail” to “pass”. However, a quantitative ranking (in terms of importance) of the service-related input factors (with respect to the selected system responses) is challenging.
Moreover, there are minimum values of AVD and FlexCap (bound to the specific service-related input configuration used for conducting these intra-BAIOP tests) which the DSO needs to take into account to guarantee system IOP, and therefore it seems that interactions do exist between these input factors.
Given these considerations, in order to derive conclusions in terms of quantitative ranking of the input factors, quantification of the interactions and “IOP thresholds”, more advanced statistical tools are needed, which might effectively assist the SG stakeholders in getting this type of information.

4. Discussion

In this work, an application of the JRC-SGILab methodology proposed in [10] is presented for IOP testing within the context of a DMS UC, by using a real-time simulation environment with a CHiL technique. The examined UC describes the flexibility activation chain involving DSO SCADA, RTU and FLEX, mutually interacting to provide the voltage support service.
Given the difficulty of the IOP testing problem, in this research work much attention has been devoted to the application of the methodological aspects formalized in the JRC-SGILab methodology report [10], namely (1) UC creation, (2) BAP and (3) BAIOP definition, and (4) the DoE procedure, the latter specifically applied in the UC under study since several parameters can change and affect the system functionality from an IOP point of view. The JRC-SGILab methodology has proven to be a valuable tool to properly support the breakdown of the IOP testing problem into a structured framework.
In this research work two types of testing, namely inter- and intra-BAIOP tests, are carried out to evaluate the system performance and IOP across different BAIOPs and within a specific BAIOP, respectively. These two types of IOP testing can be seen as an extension/enhancement of the “testing” activity of the JRC-SGILab methodology [10], since they can be used in a complementary manner by SG stakeholders when they are interested in assessing the system IOP both across the selected BAIOPs (inter-BAIOP IOP testing) and within a specific BAIOP (intra-BAIOP IOP testing). While the concept of intra-BAIOP testing is somewhat implicit in [10], the concept of inter-BAIOP testing is specified here for the first time and can be further investigated. For example, it can be adapted (depending on the UC under analysis) to formalize a UC-specific indicator which can effectively assist DSOs and SG stakeholders in general during their decision-making process when examining system IOP across different BAIOPs.
It has to be highlighted that the AoE step of the JRC-SGILab methodology (involving sensitivity analysis, metamodeling and adaptive strategies), although of high interest for this research work, is not specifically applied to the selected UC. In fact, this research work originated from the motivation to first “set the scene” of an IOP testing layout for a large-scale and heterogeneous system (i.e., one entailing parameters related not only to the power grid but also to the communication network), before performing more advanced statistical analyses, whose effectiveness can be fully exploited in a system with a higher maturity level. Nonetheless, this first application of the methodology has revealed some interesting features and potential pitfalls which need to be taken into account when performing IOP testing in UCs and systems similar to those of this research work. These considerations are discussed hereafter and will guide the authors in refining the test bed and the UC, and in implementing the AoE procedure in an already known environment.
The DoE phase has emerged as critical, besides the BAP-BAIOP specification. In particular, in order to carry out a full characterization of the system under analysis and a thorough IOP test campaign, considerable effort must be put into the definition of the input factors and of the system response(s), so as to properly assess the overall system performance, detect possible IOP issues and identify the parameters driving the IOP verdict. For example, in the performed intra-BAIOP testing, four service-related input factors are selected during the DoE phase as being potentially able to influence the system output, while the communication parameters are fixed within each BAIOP. However, one could choose different service-related parameters (and/or system outputs), or even increase their number. Along the same lines, in intra-BAIOP testing the communication-related parameters (fixed in this research work) could also be varied together with the service-related input factors, to investigate the relative importance of heterogeneous (i.e., service- and communication-related) parameters in jointly driving the system performance. In that case, the ranges of variation of the communication-related parameters would have to be specified such that the communication requirements of the selected telecommunication architecture (BAIOP) are not violated. In other words, if their ranges are not defined carefully, the experimenter risks considering communication parameters which are not realistic for the specific telecommunication technology under study. Clearly, this aspect is not trivial and requires further investigation of the standards regulating the communication interfaces between the UC actors. Still, as long as the analyst is interested in investigating the system IOP across different BAIOPs, the concept of inter-BAIOP testing remains valid as a complementary type of test.
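To make the sampling step of the DoE concrete, the following Python sketch draws N configurations of the four service-related input factors according to the PDFs reported in Table 6; the variable names and the choice of N are illustrative only and are not prescribed by the methodology.

# Illustrative sketch of the intra-BAIOP sampling step: draw N configurations of
# the four service-related input factors according to the PDFs of Table 6.
import numpy as np

rng = np.random.default_rng(seed=42)
N = 100  # number of intra-BAIOP test configurations (choice of the experimenter)

samples = {
    "RTUProcT_ms": rng.normal(loc=90.0, scale=10.0, size=N),  # Gaussian
    "AVD_percent": rng.uniform(2.5, 7.5, size=N),             # uniform
    "FlexRespT_ms": rng.uniform(60.0, 80.0, size=N),          # uniform
    "FlexCap_kW": rng.uniform(41.8, 156.75, size=N),          # uniform
}

# Each index i defines one test-case configuration to be replayed on the HiL
# test bed with the communication parameters of the chosen BAIOP held fixed.
for i in range(3):  # print the first few configurations as a sanity check
    print({name: round(values[i], 2) for name, values in samples.items()})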
Moreover, the choice of additional input parameters is not as simple as it may seem at first glance: it inevitably increases the system dimensionality. This can be handled only by deploying more advanced statistical tools able to cope effectively with the higher dimensionality and complexity. One example is the integration of a proper sensitivity analysis (as suggested in [10]), which can assist the experimenter in (1) quantitatively ranking the input factors according to their importance in driving the IOP verdict, (2) reducing the dimensionality of the system and (3) revealing interactions between system inputs. In particular, properties (1) and (3) would support further investigation, given that in the intra-BAIOP testing performed in this research work the ranking of the input factors and the identification of the interactions between them are possible only in qualitative terms. The authors are aware of the potential of this statistical analysis, and its application is envisaged in future work.
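As an example of the kind of quantitative ranking such a sensitivity analysis could provide, the following sketch computes standardized regression coefficients (SRCs) from a design matrix X of sampled input factors and a measured response y (e.g., t_res). The data used below are synthetic placeholders, since the actual responses would come from the HiL runs; variance-based indices (e.g., Sobol indices) would follow the same workflow.

# Hedged sketch: rank input factors by standardized regression coefficients (SRCs)
# of a linear metamodel fitted on standardized inputs and outputs.
import numpy as np

def src_ranking(X, y, names):
    """Return factor names sorted by |SRC|, largest (most influential) first."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    A = np.column_stack([np.ones(len(ys)), Xs])          # intercept + factors
    beta, *_ = np.linalg.lstsq(A, ys, rcond=None)        # least-squares fit
    coeffs = dict(zip(names, np.abs(beta[1:])))
    return sorted(coeffs.items(), key=lambda kv: kv[1], reverse=True)

# Synthetic placeholder data standing in for (samples, measured t_res) pairs:
rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 4))
y = 2.0 * X[:, 1] + 0.5 * X[:, 3] + 0.1 * rng.normal(size=100)  # toy response
print(src_ranking(X, y, ["RTUProcT", "AVD", "FlexRespT", "FlexCap"]))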
Regarding the IOP testing outcome itself, in this research work the IOP verdict is evaluated in terms of “pass” or “fail”. However, in order to deepen the system IOP investigation, it can be of interest to define an “interoperability boundary”, i.e., to identify the regions in the system input space where the IOP verdict changes from pass to fail. The same consideration made for the increase in system dimensionality applies here, i.e., more advanced statistical tools would be required for this purpose.
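Purely as an illustration of the concept, an approximate interoperability boundary over two factors could be traced by scanning a grid of (AVD, FlexCap) pairs with the remaining factors held at nominal values, reusing the same hypothetical run_flexibility_test() wrapper introduced above; the resulting boolean map separates the pass and fail regions, and its contour approximates the IOP boundary.

# Hedged sketch: pass/fail map over the (AVD, FlexCap) plane within the Table 6 ranges.
import numpy as np

def iop_boundary_map(run_test, n_grid: int = 20):
    """Scan an n_grid x n_grid grid and record the verdict (True = pass)."""
    avd_grid = np.linspace(2.5, 7.5, n_grid)        # AVD range (%)
    cap_grid = np.linspace(41.8, 156.75, n_grid)    # FlexCap range (kW)
    verdicts = np.zeros((n_grid, n_grid), dtype=bool)
    for i, avd in enumerate(avd_grid):
        for j, cap in enumerate(cap_grid):
            verdicts[i, j] = run_test(avd, cap)     # one HiL run per grid point
    return avd_grid, cap_grid, verdicts

Each grid point again corresponds to one full HiL run, which is why more sample-efficient tools (e.g., metamodels or adaptive sampling, as foreseen in the AoE step of [10]) become necessary as the dimensionality grows.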
As far as UC development is concerned, the possibility has also emerged of including (within the class of UCs similar to the one examined in this research work) market-related considerations that map easily to the business SGAM layer. For instance, it might be of interest to consider a DSO-oriented indicator, set according to the DSO preference and taking into account variable scores for quantifying the quality of service. In this case, a business-related system output is worth investigating.
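One purely hypothetical form of such a DSO-oriented indicator, given here only to illustrate the idea, is a preference-weighted sum of normalized quality-of-service scores; the score names and weights below are invented for the example and are not defined in this work.

# Hedged sketch: hypothetical DSO-oriented indicator as a weighted sum of
# quality-of-service scores normalized to [0, 1] (1 = best).
def dso_indicator(scores: dict, weights: dict) -> float:
    """Weights reflect the DSO preference and are assumed to sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[name] * scores[name] for name in weights)

# Invented example values: restored voltage quality, speed of restoration and
# cost of the activated flexibility.
example = dso_indicator(
    scores={"voltage_quality": 0.9, "restoration_speed": 0.7, "flex_cost": 0.5},
    weights={"voltage_quality": 0.5, "restoration_speed": 0.3, "flex_cost": 0.2},
)
print(round(example, 3))  # 0.76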
Furthermore, the testing environment adopted in this research work provides the foundation for more sophisticated tests in which DSOs and flexibility providers can assess the IOP of the different actors in achieving the service objectives by: (1) refining the test bed through the inclusion of the indirect flexibility activation mechanism (upper-bound architecture) via a market platform (through aggregators) or via an energy management system considering operational and economic objectives; (2) modifying the test bed to include hardware equipment (PHiL tests including real power hardware components such as storage units, photovoltaic inverters, etc.); (3) combining (1) and (2).
In short, extending the analyses towards the statistical AoE, raising the maturity level of the testing environment and applying a more detailed DoE procedure are envisaged by the authors as future work.

5. Conclusions

This research work successfully applied the “Smart Grid Interoperability Testing Methodology” proposed by the JRC-SGILab [10] to IOP testing of an SG DMS application, demonstrating its flexibility and usefulness in supporting IOP testing in the SG domain.
On the one hand, the obtained results are specific to the examined UC and the adopted test bed. On the other hand, this research work gives an example of how the breakdown of the IOP problem can be structured by employing a robust methodological approach, and it highlights the pitfalls that may be encountered when similar SG applications are addressed. A detailed DoE proves to be of paramount importance for effectively assessing the performance (from an IOP perspective) of complex systems in SG applications. Moreover, the results show that refined statistical analyses are needed, in particular if a detailed investigation of the IOP behavior is to be achieved in terms of importance ranking of the system parameters and identification of the interactions between them. Furthermore, the formalized concepts of inter- and intra-BAIOP testing help to effectively reflect different (but complementary) interests of SG stakeholders. In addition, working in a real-time HiL simulation environment has proven promising for effectively modelling the heterogeneous and complex nature of present-day SG applications (where ICT requirements become of paramount importance besides the grid-related parameters).
In short, this paper has demonstrated the importance of applying robust and integrated methodologies to effectively tackle the SG IOP testing challenge, setting the scene for promising future extensions within the context of complex SG power systems.

Author Contributions

Conceptualization, methodology, formal analysis, investigation, writing–original draft preparation, M.G. and A.A.; writing–review and supervision, F.P.; supervision, A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank the staff of the Austrian Institute of Technology for their support in setting up the test bed in the SmartEST laboratory infrastructure within the scope of the European Union’s Horizon 2020 research and innovation program (H2020/2014-2020) in the project “ERIGrid” (Grant Agreement No. 654113) under Transnational Access (TA) User Project 04.021-2018. Furthermore, the authors appreciate the collaborative support from the Trialog staff for the activities related to the identification of telecommunication technologies.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations (in alphabetical order) are used in this manuscript:
AoE | Analysis of Experiments
BAIOP | Basic Application Interoperability Profile
BAP | Basic Application Profile
DER | Distributed Energy Resource
DMS | Distribution Management System
DoE | Design of Experiments
DSL | Digital Subscriber Line
DSM | Demand Side Management
DSO | Distribution System Operator
EC | European Commission
EV | Electric Vehicle
FLEX | Flexibility source
(P)/(C)-HiL | (Power)/(Controller)-Hardware-in-the-Loop
ICT | Information and Communication Technology
IOP | Interoperability
JRC | Joint Research Centre
LV | Low Voltage
PLC | Power Line Communication
PMU | Phasor Measurement Unit
RF | Radio Frequency
RTU | Remote Terminal Unit
SG | Smart Grid
SGAM | Smart Grid Architecture Model
SG-CG | Smart Grid Coordination Group
SGILab | Smart Grid Interoperability Laboratory
SS | Smart Sensor
UC | Use Case
VPN | Virtual Private Network

References

1. Kotsakis, E.; Fulli, G.; Masera, M. Smart Grid Interoperability lab at the Joint Research Centre (JRC) of the European Commission: Towards a European platform for real time simulation. In Proceedings of the AEIT International Annual Conference, Capri, Italy, 5–7 October 2016; pp. 1–6.
2. CEN-CENELEC-ETSI Smart Grid Coordination Group. Methodologies to Facilitate Smart Grid System Interoperability Through Standardization, System Design and Testing; CEN-CENELEC: Brussels, Belgium, 2014.
3. CEN-CENELEC-ETSI Smart Grid Coordination Group. Smart Grid Set of Standards, Version 4.1; CEN-CENELEC: Brussels, Belgium, 2017.
4. Ustun, T.S. Interoperability and interchangeability for microgrid protection systems using IEC 61850 standard. In Proceedings of the IEEE International Conference on Power and Energy (PECon), Melaka, Malaysia, 28–30 November 2016; pp. 20–25.
5. Lin, C.I.; Liao, J.L.; Lin, C.H.; Dong, J.L. Implementation of an IEC 61400-25 based wind power plant SCADA system: Case study: Datan wind power plant of the Taiwan power company. In Proceedings of the 2017 IEEE International Conference on Smart Grid and Smart Cities (ICSGSC), Singapore, 23–26 July 2017; pp. 103–107.
6. Song, E.Y.; FitzPatrick, G.J.; Lee, K.B. Smart Sensors and Standard-Based Interoperability in Smart Grids. IEEE Sens. J. 2017, 17, 7723–7730.
7. Van Amelsvoort, M.; Delfs, C.; Uslar, M. Application of the interoperability score in the smart grid domain. In Proceedings of the 2015 IEEE 13th International Conference on Industrial Informatics (INDIN), Cambridge, UK, 22–24 July 2015; pp. 442–447.
8. Ford, H.; Colombi, J.; Graham, S.; Jaques, D. The Interoperability Score. In Proceedings of the 5th Annual Conference on Systems Engineering Research, Hoboken, NJ, USA, 14–16 March 2007; p. 33.
9. Tolk, A.; Muguira, J.A. The Levels of Conceptual Interoperability Model. In Proceedings of the 2003 Fall Simulation Interoperability Workshop, Orlando, FL, USA, 14–19 September 2003.
10. Papaioannou, I.; Tarantola, S.; Lucas, A.; Kotsakis, E.; Marinopoulos, A.; Ginocchi, M.; Masera, M. Smart Grid Interoperability Testing Methodology; Publications Office of the European Union: Luxembourg, 2018; ISBN 978-92-79-96855-6.
11. Andreadou, N.; Lucas, P.I.; Masera, M. Interoperability Testing Methodology for Smart Grids and Its Application on a DSM Use Case—A Tutorial. Energies 2018, 12, 8.
12. Andreadou, N.; Lucas, A.; Tarantola, S.; Poursanidis, I. Design of Experiments in the Methodology for Interoperability Testing: Evaluating AMI Message Exchange. Appl. Sci. 2019, 9, 1221.
13. INTERactions between Stakeholders and the Technical and Economic Potential of Local FLEXibilities. Available online: https://interflex-h2020.com/ (accessed on 27 February 2020).
14. Kupzog, F.; Genest, O.; Ahmadifar, A.; Berthome, F.; Cupelli, M.; Kazmi, J.; Savic, M.; Monti, A. SGAM-based comparative study of interoperability challenges in European flexibility demonstrators: Methodology and results. In Proceedings of the IEEE 16th International Conference on Industrial Informatics (INDIN), Porto, Portugal, 18–20 July 2018; pp. 692–697.
15. Kazmi, J.; Ahmadifar, A.; Kupzog, F.; Cupelli, M.; Ginocchi, M.; Genest, O.; Calin, M.; Savic, M.; Monti, A. Identification of common services in European Flexibility Demonstrators for Laboratory-based Interoperability Validation. In Proceedings of the 2019 8th International Conference on Renewable Energy Research and Applications (ICRERA), Brasov, Romania, 3–6 November 2019; pp. 857–863.
16. Chen, N. Passive Interoperability Testing for Communication Protocols. Ph.D. Thesis, Université Rennes 1, Rennes, France, 2013; NNT: 2013REN1S046.
17. Song, E.Y.; Lee, K.B.; FitzPatrick, G.J.; Zhang, Y. Interoperability test for IEC 61850-9-2 standard-based merging units. In Proceedings of the 2017 IEEE Power & Energy Society Innovative Smart Grid Technologies Conference (ISGT), Washington, DC, USA, 23–26 April 2017; pp. 1–6.
18. Miswan, N.S.; Ridwan, M.I.; Hayatudin, A.; Aminuddin Musa, I. Interoperability testing for Digital Substation in Smart Grid domain: A power utility perspective. In Proceedings of the 2015 International Symposium on Technology Management and Emerging Technologies (ISTMET), Langkawi Island, Malaysia, 25–27 August 2015; pp. 154–158.
19. Hardware-in-the-Loop Testing Applications. Available online: https://www.add2.co.uk/applications/hil/ (accessed on 27 February 2020).
20. CIGRE Task Force C6.04.02. Benchmark Systems for Network Integration of Renewable and Distributed Energy Resources; CIGRE: Paris, France, 2013; ISBN 978-285-873-270-8.
21. Common Open Research Emulator (CORE). Available online: https://www.nrl.navy.mil/itd/ncs/products/core (accessed on 27 February 2020).
22. D3.7: Interoperability and Interchangeability Validation Results. Available online: https://interflex-h2020.com/results/deliverables/ (accessed on 27 February 2020).
23. Stahleder, D.; Reihs, D.; Lehfuss, F. Lablink—A novel co-simulation tool for the evaluation of large scale EV penetration focusing on local energy communities. In Proceedings of the CIRED 2018 Workshop on Microgrids and Local Energy Communities, Ljubljana, Slovenia, 7–8 June 2018; ISBN 978-2-9602415-1-8.
24. CEN-CENELEC-ETSI Smart Grid Coordination Group. Sustainable Processes; CEN-CENELEC: Brussels, Belgium, 2012.
25. Markiewicz, H.; Klajn, A. Voltage Disturbances—Standard EN 50160—Voltage Characteristics in Public Distribution Systems; Copper Development Association: Hertfordshire, UK, 2004.
26. CEN-CENELEC-ETSI Smart Grid Coordination Group. Overview of SG-CG Methodologies; CEN-CENELEC: Brussels, Belgium, 2014.
27. RTU2020 Remote Terminal Unit Specification. Available online: https://www.honeywellprocess.com/library/marketing/tech-specs/SC03-300-101-RTU-2020.pdf (accessed on 27 February 2020).
28. Energy Storage Technologies. Available online: http://www.imperial.ac.uk/grantham/energy-storage/ (accessed on 27 February 2020).
29. Eid, C.; Codani, P.; Perez, Y.; Reneses, J.; Hakvoort, R. Managing electric flexibility from Distributed Energy Resources: A review of incentives for market design. Renew. Sustain. Energy Rev. 2016, 64, 237–247.
Figure 1. Block diagram of the EC-JRC-SGILab proposed methodology for IOP testing [10].
Figure 2. Overall view of the test bed setup adopted in this research work (left and middle parts) [15].
Figure 3. CIGRE LV feeder power grid used as reference grid model [20].
Figure 4. Message sequence chart representing the interactions between the UC actors.
Figure 5. Mapping of the selected UC to the SGAM layers: component, function and communication layers are represented at the same time (on the top). The color convention of the SGAM layers (on the bottom) is maintained [26].
Figure 6. Inter-BAIOP testing—impact of the different considered BAIOPs on the system response, in terms of t_res, V_res and the outcome of the IOP test.
Figure 7. Intra-BAIOP testing—impact of the admitted voltage deviation (AVD) on the system response (in terms of t_res and V_res) and on the IOP verdict.
Figure 8. Intra-BAIOP testing—impact of the flexibility capacity (FlexCap) on the system response (in terms of t_res and V_res) and on the IOP verdict.
Figure 9. Intra-BAIOP testing—impact of the flexibility response time (FlexRespT) on the system response (in terms of t_res and V_res) and on the IOP verdict.
Figure 10. Intra-BAIOP testing—impact of the RTU processing time (RTUProcT) on the system response (in terms of t_res and V_res) and on the IOP verdict.
Table 1. List of the actors for the selected UC as defined in [24].
Actor Name | Actor Type | Actor Description | Further Information Specific to the UC
SCADA system | Application | Supervisory Control And Data Acquisition system provides the basic functionality for implementing Energy Management System or Data Management System, especially provides the communication with the substations to monitor and control the grid | DSO SCADA here refers to the monitoring actor of the distribution grid and consists of the acquisition and supervisory control units (see Table 2).
Energy Management Gateway (EMG) | System | An access point (functional entity) sending and receiving smart grid-related information and commands between actor A and the Customer Energy Manager (CEM), letting the CEM decide how to process the events. The communication is often achieved through an internet connection or through a wireless connection | In this UC, the EMG is intended to be the RTU through which the DSO SCADA interfaces the flexibility.
Flexible Load | Role | Load that can be modulated | In this UC, the flexible load in the power grid is referred to as the FLEX actor and it is assumed to be present with its full flexible capacity when required.
Table 2. Step-by-step analysis of the UC.
Step No. | Event | Description of Process or Activity | Information Producer | Information Receiver | Information Exchanged
1 | DSO SCADA data reading and storage | DSO SCADA reads nodal voltage of the power grid and stores the last 10 measurements | Node R11 of CIGRE LV feeder grid model | DSO SCADA acquisition unit | Measured voltage
2 | DSO SCADA data acquisition | DSO SCADA readings are converted into an average value | DSO SCADA acquisition unit | DSO SCADA acquisition unit | Computation of the average of the last 10 voltage readings
3 | DSO SCADA logic actuation | If the average of the latest 10 node voltage measurements is lower than a predefined threshold, a flexibility activation command is produced. Otherwise, no activation command is sent out | DSO SCADA acquisition unit | DSO SCADA supervisory control unit | Logical analysis of the measured voltage
4 | Command to RTU | If flexibility is required, the DSO SCADA sends a flexibility activation command to RTU | DSO SCADA supervisory control unit | RTU | Flexibility activation command
5 | Command to FLEX | RTU sends the flexibility activation command to FLEX | RTU | FLEX | Flexibility activation command
6 | Flexibility dispatch | FLEX injects all its amount of flexibility into node R11 | FLEX | Node R11 of CIGRE LV feeder grid model | Power injection
7 | Acknowledgment | FLEX acknowledges RTU that FLEX has injected all its amount of power according to the flexibility activation command | FLEX | RTU | Feedback from FLEX
8 | Acknowledgment | RTU acknowledges DSO SCADA that FLEX has injected all its amount of power according to the flexibility activation command | RTU | DSO SCADA supervisory control unit | Feedback from RTU
Table 3. List of the BAPs considered for the selected UC, defined according to the considered communication technologies.
From Actor | To Actor | Technology | BAP Identifier
DSO | RTU | Ethernet xDSL (Digital Subscriber Line) cable | BAP 1a
DSO | RTU | Mobile network | BAP 1b
DSO | RTU | Real-Time Communication (RTC) | BAP 1c
DSO | RTU | Narrow-band Power Line Communication (PLC)/Radio Frequency (RF) Mesh | BAP 1d
RTU | FLEX | Fiber (home)/Local Ethernet | BAP 2a
RTU | FLEX | Narrow-band PLC/RF Mesh | BAP 2b
Table 4. Definition of the communication parameters specifying the different telecommunication technology options considered in Table 3.
Technology | Bandwidth (Mbps) | Background Traffic (Mbps) | Delay (μs) | Jitter (μs) | Packet Loss (%) | Duplicate (%)
Fiber (home)/Local Ethernet | 100 | Link dependent | 3000 | 1000 | 0 | 0
Ethernet xDSL cable | 20 | Link dependent | 30,000 | 10,000 | 0 | 0
Mobile network | 10 | Link dependent | 60,000 | 20,000 | 1 | 0
Narrow-band PLC/RF Mesh | 0.1 | Link dependent | 300,000 | 100,000 | 3 | 0
RTC | 0.056 | Link dependent | 150,000 | 50,000 | 0 | 0
Table 5. List of the BAIOPs considered within the selected UC for the communication layer.
Use Case | BAIOP Identifier | BAPs Identifier
Lower-Bound Voltage Support | BAIOP 1 | BAP 1a + BAP 2a
Lower-Bound Voltage Support | BAIOP 2 | BAP 1b + BAP 2b
Lower-Bound Voltage Support | BAIOP 3 | BAP 1a + BAP 2b
Table 6. List of the service-related input factors, together with their statistical characterization. For the Gaussian PDF, the mean (μ) and standard deviation (σ) are reported, while the min and max values specify the uniform PDFs.
Input Factor Name | PDF Type | PDF-Specific Parameters
RTUProcT (ms) | Gaussian | μ = 90, σ = 10
AVD (%) | Uniform | min = 2.5, max = 7.5
FlexRespT (ms) | Uniform | min = 60, max = 80
FlexCap (kW) | Uniform | min = 41.8, max = 156.75
Table 7. Specification of the DoE procedure in the examined UC.
DoE Steps | Description
Define the goals of the experiment | Assess the IOP of the three actors involved in the flexibility activation chain in order to perform the functionality as described in the examined UC, under different operational conditions. Two types of IOP tests are performed, namely inter-BAIOP and intra-BAIOP. See Section 2.3.5.
Identify the system response to be measured | Two different outputs are measured: restored voltage (V_res) and restoration time (t_res). Equations (1)–(3) specify the criteria used for defining the IOP verdict.
Identify the input factors | Communication-related input factors: Bandwidth, Background traffic, Delay, Jitter, Packet loss, Duplicate (see Table 4). Service-related input factors: RTUProcT, AVD, FlexRespT, FlexCap (see Table 6).
Identify the intervals of variation of the input factors | Taking into account the chosen BAIOPs, the values of the communication-related input factors are reported in Table 4, while the intervals of variation for the service-related input factors are reported in Table 6.
Sample N values for the inputs within their intervals of variation | For the intra-BAIOP testing, N values are sampled for each service-related input factor within its range of variation.
Table 8. Example of Test Description document for an intra-BAIOP test case.
Test Case ID | Intra-BAIOP/TC1
BAIOP ID/UC ID | BAIOP 1/Lower-Bound Voltage Support
Interoperability layer | Functional
Summary of the test | DSO SCADA monitors node voltages of the power grid. One specific node (R11) of the LV-feeder CIGRE grid model is considered and a specific disturbance (causing the voltage to drop) is introduced therein. The DSO SCADA logic is actuated in the case that the average of the latest 10 node voltage measurements is lower than the predefined threshold (AVD). At this point, the DSO SCADA sends a flexibility activation signal to the RTU, which reacts with a delay equal to RTUProcT and sends the flexibility activation command to the FLEX available at that node. FLEX, after a time equal to FlexRespT, reacts to the RTU activation signal and injects all its amount of flexibility (equal to FlexCap) into that node.
Test Purpose | Evaluate the IOP between all the actors involved in the flexibility activation chain under different operational conditions.
Test Description
Step 1 | Perform initial measurements to ensure the well-functioning of the communication between the EUT, the power grid model and the network emulator
Step 2 | Set up the communication parameters representing the selected BAIOP
Step 3 | After introducing a fixed disturbance at the specific node, the flexibility activation chain starts operating under a given configuration of the service-related input factors (RTUProcT, AVD, FlexRespT, FlexCap)
Step 4 | Collect the measurement values in terms of V_res and t_res
Step 5 | IOP test verdict: PASS if F < AVD; FAIL otherwise. The value of F is defined as described in Equation (3).
