Privacy Impact in Systems Engineering
In my professional field, I work as a systems engineer for transportation systems. The role of a systems engineer is to design, integrate and manage complex systems over their life cycles (IEEE Computer Society 2011), applying systems-thinking principles to organise the body of knowledge, which typically spans project requirements, interfaces and compliance data. The purpose of data collection and management in a systems engineering context is to monitor and analyse the performance metrics of a complex engineering system, such as an aircraft engine, in order to optimise its operation, maintenance and performance (Blanchette 2019).
Data is collected from design documentation covering different aspects of the engineering system, capturing parameters such as hardware and software information and internal and external subsystem interfaces, together with live datasets collected during system operation. The collected data is stored in a secure, centralised database with additional backups to ensure data integrity and availability. Engineers and analysts use the data for interface management, change management, predictive maintenance, anomaly detection, and performance optimisation through advanced analytics and machine learning algorithms. All modifications to database information must comply with the specified change management policies, and all actions must be traceable through formalised documentation and the approval of relevant stakeholders. The key stakeholders, together with their agency and relationships, are identified and categorised in Table 1. Key Stakeholders.
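The traceability requirement described above can be sketched in code. The following is a minimal, illustrative model of a change-management ledger in which no modification is applied without a minimum number of stakeholder approvals; the class and field names are assumptions for illustration, not part of any real system described in this report.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class ChangeRecord:
    """One traceable modification to the systems engineering database."""
    item_id: str          # e.g. a requirement or interface identifier (illustrative)
    description: str      # what is being changed and why
    requested_by: str
    approvers: List[str] = field(default_factory=list)
    applied_at: str = ""  # ISO timestamp, set only once the change is approved

class ChangeLog:
    """Minimal change-management ledger: changes require stakeholder approval."""

    def __init__(self, required_approvals: int = 2):
        self.required_approvals = required_approvals
        self.records: List[ChangeRecord] = []

    def apply(self, record: ChangeRecord) -> bool:
        if len(record.approvers) < self.required_approvals:
            return False  # reject: insufficient stakeholder approval
        record.applied_at = datetime.now(timezone.utc).isoformat()
        self.records.append(record)  # the ledger itself provides traceability
        return True
```

In practice the ledger would live in the centralised database and be append-only, so that every approved change remains auditable.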
The data volume is significant, especially with data on asset design and Test & Validation. The variety includes structured sensor data and unstructured data such as maintenance logs and service reports. The sensitivity of the data is high, as it includes critical performance metrics that directly impact the safety and reliability of the engineering system.
II. IMPACTS
A data impact assessment is needed to identify potential risks associated with data collection, storage and usage, especially regarding data integrity, security and privacy. A detailed privacy risk impact assessment was conducted in accordance with the Australian Privacy Principles from Schedule 1 of the Privacy Amendment (Enhancing Privacy Protection) Act 2012, published by the Office of the Australian Information Commissioner (OAIC 2014), using the Data Protection Impact Assessment (DPIA) template from the International Association of Privacy Professionals (IAPP 2024).
Based on the DPIA conducted in Appendix A – DPIA, the main impacts of data collection, use and storage can be categorised as follows:
1. Security of Information:
• Unauthorised Data Access: This could expose sensitive information, potentially causing legal and reputational damage.
• Duplication of System Databases: Inconsistent and inaccurate data across databases can create issues during system maintenance and interfacing.
• Existing Network Configuration Data Inaccuracy: This can affect the correct functioning of system development.
• Interfacing Database Compatibility: Incompatible databases can lead to operational failures.
• System Downtime: This can hinder emergency response, cause unnecessary service disruptions, and affect overall system functionality.
2. Quality of Information:
• Live-feed Data Inaccuracies or Anomalies: This can lead to unreliable service and potential safety hazards.
While the mitigation measures outlined in the DPIA address the identified risks, some unintended or unwanted impacts remain only partially mitigated. Residual risk remains for unauthorised data access: the likelihood is high given the complexity of the system and the potential for malicious actors to exploit vulnerabilities (Tene & Polonetsky 2012). The increasing frequency and sophistication of cyber-attacks targeting transportation systems (Boyd & Crawford 2012) highlight the importance of continuous vigilance and regular access control reviews. There are also limits to detecting all live-feed data inaccuracies and anomalies. This risk affects stakeholders involved in data analysis and decision-making: inaccurate or anomalous data can lead to erroneous decisions, affecting system reliability and safety, and requires ongoing monitoring and improvement of data validation processes. Furthermore, the severity of system downtime is significant; it can disrupt operations, leading to financial losses, inconvenience for users and potential safety risks. Even with mitigation efforts, downtime cannot be entirely eliminated.
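To make the live-feed validation problem concrete, a simple detection approach is a rolling z-score check: flag any reading that deviates too far from the recent history. This is a minimal sketch under assumed parameters (window size, threshold), not the validation process actually deployed in any system described here.

```python
import statistics

def flag_anomalies(readings, window=20, threshold=3.0):
    """Flag indices whose reading lies more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu = statistics.fmean(history)
        sigma = statistics.stdev(history)
        # Guard against a zero-variance window before dividing.
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged
```

A check like this illustrates the limitation noted above: slow drifts and anomalies that stay within the historical spread pass undetected, which is why ongoing improvement of validation processes is required rather than a one-off rule.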
III. ETHICAL AND REGULATORY CONCERNS
In transportation system engineering, the collection and use of data entail various ethical and regulatory considerations (Parnas & Clements 1986). The collection of sensitive performance data raises privacy concerns about individuals' personal information, including the risk of unauthorised access to or misuse of data during predictive maintenance and system optimisation (Scarfone 2010). These concerns are heightened by the strong public interest in safe and reliable transportation systems: any inaccuracies or anomalies in the collected data could jeopardise public safety and undermine trust in the system. From a regulatory standpoint, compliance with data security regulations is central to protecting important information from hacking or unauthorised access. It is also essential to maintain the integrity and accuracy of collected data for regulatory compliance, system operation and public trust.
In alignment with the Australian Privacy Principles (OAIC 2014), the following regulatory impacts are identified in Appendix A – DPIA:
• APP 3 - Collection of personal information: Organisations must collect personal information only if it is necessary for their functions and activities (OAIC 2014).
• APP 10 - Quality of personal information: Organisations have a responsibility to ensure that the personal information they collect is accurate, up-to-date and complete (OAIC 2014).
• APP 11 - Security of personal information: Organisations must take reasonable steps to protect personal information from misuse, interference, loss, unauthorised access, modification or disclosure (OAIC 2014).
• APP 12 - Access to personal information: Individuals have the right to access their personal information (OAIC 2014).
• APP 13 - Correction of personal information: Individuals have the right to have their information corrected if it is inaccurate (OAIC 2014).
Beyond the regulatory concerns identified, there are further privacy and ethical concerns to consider. For instance, data collection practices should respect individuals' privacy by collecting the minimum amount of information necessary, with clear policies for data retention and deletion (Verhulst et al. 2019). For transparency, individuals should know how their data is being used and stored, and should be able to opt out. This also includes addressing algorithmic biases and unequal impacts on different user groups; public perception may frame such practices as surveillance rather than mere collection of information. Moreover, there are ethical considerations beyond privacy, such as ensuring fairness and accountability in data-driven decisions, maintaining human oversight over automated systems, and addressing possible job displacement within the transportation sector due to automation (ISO 2021).
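The data-minimisation principle discussed above can be enforced mechanically at ingestion. The sketch below keeps only the fields the stated purpose requires and drops everything else; the field names are hypothetical examples, and a real allow-list would come from the organisation's documented collection purpose, not a hard-coded constant.

```python
# Fields assumed necessary for predictive maintenance (illustrative list
# for the APP 3 necessity test); everything else is dropped at ingestion.
REQUIRED_FIELDS = {"asset_id", "timestamp", "sensor_readings"}

def minimise(record: dict) -> dict:
    """Keep only the fields the stated collection purpose requires."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}
```

Applying this at the ingestion boundary means incidental personal information, such as an operator's name attached to a maintenance log, never enters the centralised store in the first place.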
IV. RISK PROFILE
Identified risks are associated with the collection, storage and usage of data in transportation system engineering. The major risks fall into three categories:
• Loss of confidentiality
• Loss of integrity
• Loss of availability
Identified risks and affected stakeholders are listed and categorised in Table 2 based on Appendix B – Risk Profile Configuration.
To mitigate these risks, several strategies can be used, as detailed in the mitigation measures column of Appendix A – DPIA. To prevent unauthorised interception or access, data should be encrypted both in transit and at rest (Sohail, Sharma & Ciric 2018). Regular audits and assessments should be conducted to ensure compliance with privacy regulations and data security standards. Relationships with suppliers and vendors should be strengthened to ensure timely delivery of project components, and a critical-spares backup repository should be maintained to reduce lead time during system downtime. Training stakeholders in data privacy principles and best practices for handling data promotes sensitivity and compliance. A culture of continuous improvement should be fostered through routine reviews of data security incidents: identifying root causes and executing corrective actions resolves underlying issues and prevents recurrence. Regular inspections and preventive maintenance, combined with effective audits by competent personnel, are all necessary to prevent system failure and to address potential issues before they escalate into downtime-causing failures.
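The access control and audit measures above can be illustrated with a small role-based check that records every attempt, allowed or not, for later audit review. The roles, permissions and user names here are assumptions for the sketch; a production system would load them from a managed identity provider and write to an append-only, tamper-evident store.

```python
from datetime import datetime, timezone

# Illustrative role-to-permission mapping (assumed, not from any real system).
PERMISSIONS = {
    "engineer": {"read", "propose_change"},
    "data_analyst": {"read"},
    "config_manager": {"read", "propose_change", "approve_change"},
}

audit_log = []  # in practice: an append-only, tamper-evident audit store

def authorise(user: str, role: str, action: str) -> bool:
    """Check an action against the role's permissions and log the attempt."""
    allowed = action in PERMISSIONS.get(role, set())
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "action": action,
        "allowed": allowed,
    })
    return allowed
```

Because denied attempts are logged alongside granted ones, the routine audit reviews recommended above have the raw material needed to spot probing by unauthorised actors.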
Moreover, this approach also enables organisations to detect inaccuracies or anomalies within the stored information through predictive maintenance techniques using advanced analytics or machine learning algorithms. By examining historical data for patterns that indicate possible upcoming failures, maintenance activities can be planned in advance, preventing downtime through proactive measures. Lastly, there should be a comprehensive emergency response plan that outlines the actions to be taken if these risks materialise, and all personnel should be trained in the emergency response procedures, with regular drills to test the effectiveness of the plan.
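The predictive maintenance idea above, planning work before a degradation metric crosses a safety limit, can be sketched with a simple least-squares trend fit. This is a deliberately minimal example (a real system would use the advanced analytics mentioned in the text); the metric, units and limit are assumed for illustration.

```python
def estimate_maintenance_window(history, limit):
    """Fit a least-squares line to a degradation metric recorded per
    operating cycle (e.g. vibration amplitude, assumed for illustration)
    and estimate the cycle at which it will cross `limit`."""
    n = len(history)
    x_mean = (n - 1) / 2
    y_mean = sum(history) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(history))
    sxx = sum((x - x_mean) ** 2 for x in range(n))
    slope = sxy / sxx
    if slope <= 0:
        return None  # no upward degradation trend detected
    intercept = y_mean - slope * x_mean
    return (limit - intercept) / slope  # predicted crossing cycle
```

Scheduling maintenance some margin before the predicted crossing cycle is what turns the historical pattern into the proactive, downtime-avoiding measure the paragraph describes.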
By implementing these strategies, organisations can effectively reduce the likelihood and severity of incidents caused by data management failures, ensuring the continued reliability and availability of transportation systems.
V. CONCLUSION & RECOMMENDATIONS
Engineering transportation systems is ethically and legally sensitive work. It entails the collection, use and protection of data that raises pertinent privacy concerns involving confidentiality and appropriate usage of personal information (Helbing 2015). Because transportation safety and reliability are under public scrutiny, it is vital that inaccurate or anomalous data be identified quickly.
Regulatory compliance also demands rigorous data security. Failure to comply can have severe consequences, including lawsuits and financial penalties on top of damage to an organisation's reputation. Importantly, both regulatory compliance and public trust depend on maintaining the integrity and accuracy of the data collected.
A comprehensive risk analysis should identify potential hazards in the management of information in transportation system engineering so that they can be mitigated. Several risks exist, including unauthorised access, database breaches, inaccurate or anomalous data, non-compliance with privacy laws, standards and regulations, and system failure, all of which pose significant threats to stakeholders and warrant pre-emptive measures.
To deal with these risks effectively, strong access control mechanisms and encryption protocols should be adopted to protect sensitive data. Regular audits and assessments should be conducted to ensure that privacy regulations and data security obligations are met. Furthermore, a culture of continuous improvement should be fostered within the organisation to identify vulnerabilities and address them.
In conclusion, prioritising data security and integrity minimises the risks involved in managing transportation system engineering data. Robust mitigation strategies, a culture of compliance and continuous improvement initiatives can help organisations navigate complex data management landscapes while ensuring transportation systems remain safe and reliable.
VI. REFERENCES
Barocas, S & Selbst, AD 2016, 'Big Data's Disparate Impact', California Law Review, vol. 104, no. 3, pp. 671-732.
Blanchette, JA 2019, System Engineering Management, Wiley.
Boyd, D & Crawford, K 2012, 'Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon', Information, Communication & Society, vol. 15, no. 5, pp. 662-679.
Fang, H, Zhang, Z, Wang, CJ, Daneshmand, M, Wang, C & Wang, H 2015, 'A survey of big data research', IEEE Network, vol. 29, no. 5, pp. 6-9, doi: 10.1109/MNET.2015.7293298.
Helbing, D 2015, Thinking Ahead - Essays on Big Data, Digital Revolution, and Participatory Market Society, Springer International Publishing.
IAPP 2024, Template for Data Protection Impact Assessment (DPIA), International Association of Privacy Professionals, accessed 15 April 2024, <https://iapp.org/resources/article/template-for-data-protection-impact-assessment-dpia/>.
IEEE Computer Society 2011, Guide to the Software Engineering Body of Knowledge, IEEE Press.
ISA 2020, Quick Start Guide: An Overview of ISA/IEC 62443 Standards - Security of Industrial Automation and Control Systems, Global Cybersecurity Alliance, accessed 15 April 2024, <https://gca.isa.org/hubfs/ISAGCA%20Quick%20Start%20Guide%20FINAL.pdf>.
ISO 2021, Information technology — Artificial intelligence (AI) — Bias in AI systems and AI aided decision making, ISO/IEC.
OAIC 2014, The Australian Privacy Principles - From Schedule 1 of the Privacy Amendment (Enhancing Privacy Protection) Act 2012, Office of the Australian Information Commissioner, accessed 15 April 2024, <https://www.oaic.gov.au/__data/assets/pdf_file/0006/2004/the-australian-privacy-principles.pdf>.
OAIC 2020, Guide to undertaking privacy impact assessments, Office of the Australian Information Commissioner, accessed 15 April 2024, <https://www.oaic.gov.au/__data/assets/pdf_file/0013/2074/guide-to-undertaking-privacy-impact-assessments.pdf>.
Parnas, DL & Clements, PC 1986, 'A Rational Design Process: How and Why to Fake It', IEEE Transactions on Software Engineering, vol. 12, no. 2, pp. 251-257.
Scarfone, K 2010, Guide to Protecting the Confidentiality of Personally Identifiable Information (PII), NIST Special Publication 800-122, National Institute of Standards and Technology.
Sohail, O, Sharma, P & Ciric, B 2018, Data governance for next-generation platforms, Deloitte.
Tene, O & Polonetsky, J 2012, 'Big Data for All: Privacy and User Control in the Age of Analytics', Northwestern Journal of Technology and Intellectual Property, vol. 11, no. 5, pp. 239-274.
Verhulst, SG et al. 2019, Leveraging Private Data for Public Good: A Descriptive Analysis and Typology of Existing Practices, The GovLab at New York University Tandon School of Engineering.