
Data risks

In the increasingly data-driven landscape of modern society, data risks have emerged as critical concerns demanding careful ethical consideration and systematic management. Data risk refers to the possibility of negative outcomes arising from the collection, processing, and use of information. These risks encompass a spectrum of potential harms, including privacy violations, algorithmic discrimination, behavioral manipulation, and threats to individual autonomy, all of which carry profound implications for individuals and communities, particularly when data-driven systems directly affect citizens' lives in public administration settings.


From an ethical perspective, data risks can be examined through multiple philosophical lenses. A consequentialist approach evaluates risks by focusing on outcomes, emphasizing harm minimization and equitable benefit distribution. This perspective prompts thorough analysis of how data systems influence individuals, communities, and societal structures. Complementing this, Kantian ethics centers on respecting individuals' inherent dignity and autonomy, advocating that people should never be treated merely as means to organizational or technological ends.

The concept of the infosphere, developed by philosopher Luciano Floridi, provides additional insight by representing the interconnected environment where physical and digital realities converge. Within this infosphere, individuals function as informational organisms interacting with data systems that increasingly shape their autonomy, identity, and societal roles. This integration significantly amplifies ethical stakes, as risks affect not only individual users but broader societal structures and values.

Privacy represents a foundational concern, extending beyond mere confidentiality to encompass informational autonomy — individuals' ability to maintain control over personal data and how it shapes their identity. When organizations deploy technologies like facial recognition or integrate sensitive data into centralized databases, they often blur boundaries between public and private life, potentially undermining citizens' capacity to manage informational boundaries.
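
Several of the publications listed below investigate local differential privacy as one technical safeguard for precisely this informational autonomy: each person perturbs their own data before it leaves their device, so no central collector ever holds the raw values. As a minimal sketch of the underlying idea (illustrative code, not a method taken from any cited paper), the classic randomized-response mechanism lets a user report a sensitive yes/no attribute with plausible deniability while still allowing accurate population-level estimates:

```python
import math
import random

def randomized_response(true_value: bool, epsilon: float) -> bool:
    """Report a binary attribute under epsilon-local differential privacy:
    with probability p = e^eps / (e^eps + 1) report the truth, else flip."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_value if random.random() < p else not true_value

def estimate_frequency(reports: list[bool], epsilon: float) -> float:
    """Debias the aggregated noisy reports to estimate the true rate."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Illustration: 10,000 users, 30% hold the sensitive attribute, eps = 1.0.
random.seed(42)
truth = [random.random() < 0.3 for _ in range(10_000)]
reports = [randomized_response(v, 1.0) for v in truth]
print(f"true rate: 0.30, estimate: {estimate_frequency(reports, 1.0):.3f}")
```

The privacy parameter epsilon governs the trade-off: smaller values flip answers more often, strengthening each individual's deniability at the cost of noisier aggregate estimates.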

Algorithmic bias presents another significant risk dimension, occurring when historical inequities, societal stereotypes, or incomplete datasets influence algorithmic outputs, leading to discriminatory outcomes. High-stakes decisions — determining social benefits eligibility or identifying law enforcement targets — can perpetuate and amplify existing injustices when algorithms inherit biases from training data or design parameters.
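
One way such disparities are made visible in practice (a minimal sketch on hypothetical data, not a method from the publications below) is a simple audit comparing an algorithm's selection rate across demographic groups; the widely used "four-fifths" rule of thumb flags ratios below 0.8 as potential adverse impact:

```python
from collections import defaultdict

def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """Fraction of positive decisions (e.g. benefit granted) per group."""
    totals: dict[str, int] = defaultdict(int)
    positives: dict[str, int] = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        positives[group] += approved
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact(rates: dict[str, float]) -> float:
    """Ratio of the lowest to the highest group selection rate;
    values below ~0.8 are a common red flag for adverse impact."""
    return min(rates.values()) / max(rates.values())

# Hypothetical audit log of (demographic group, algorithm approved?).
log = [("A", True)] * 72 + [("A", False)] * 28 \
    + [("B", True)] * 45 + [("B", False)] * 55
rates = selection_rates(log)
print(rates)                                               # {'A': 0.72, 'B': 0.45}
print(f"disparate impact: {disparate_impact(rates):.2f}")  # 0.62 -> flagged
```

An audit like this only detects unequal outcomes; judging whether the disparity is unjust, and deciding how to remedy it, remains the ethical work the surrounding paragraphs describe.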

Risks to personal autonomy through manipulation represent a third critical concern. Systems employing predictive analytics and behavioral targeting can undermine individual decision-making capacity by influencing behavior in subtle yet significant ways. Designed to nudge individuals toward specific actions, often without their awareness or informed consent, these systems raise profound questions about transparency and respect for human agency.

Addressing these multifaceted risks requires comprehensive governance frameworks that integrate ethical principles into the design, deployment, and management of data systems. While regulations such as the GDPR provide baseline protections, their effectiveness depends on thoughtful implementation that embraces what Floridi calls "soft ethics": proactively embedding values that prioritize human dignity, fairness, and autonomy in data practices.
 

D4A Publications on data risks

References to academic publications

Behrendt, F., & Sheller, M. (2024). Mobility data justice.

Belen-Saglam, R., Yuan, H., Heering, M. S., Ashraf, R., & Li, S. (2025). A systematic literature review on cyber security and privacy risks in MaaS (Mobility-as-a-Service) systems.

Cagnazzo, C. (2021). The thin border between individual and collective ethics: The downside of GDPR.

Cao, M., Zhu, H., Min, M., Li, Y., Li, S., Zhang, H., & Han, Z. (2024). Protecting personalized trajectory with differential privacy under temporal correlations.

Crusoe, J., Simonofski, A., Clarinval, A., & Gebka, E. (2019). The impact of impediments on open government data use: Insights from users.

Cunningham, T., Cormode, G., Ferhatosmanoglu, H., & Srivastava, D. (2021). Real-world trajectory sharing with local differential privacy.

Du, Y., Hu, Y., Zhang, Z., Fang, Z., Chen, L., Zheng, B., & Gao, Y. (2024). Real-time trajectory synthesis with local differential privacy.

Dürr, C., & Gühring, G. S. (2025). A combined approach of heat map confusion and local differential privacy for the anonymization of mobility data.

Garroussi, Z., Legrain, A., Gambs, S., Gautrais, V., & Sansò, B. (2025). A systematic review of data privacy in mobility as a service (MaaS).

Gebka, E., & Castiaux, A. (2021). A typology of municipalities' roles and expected users' roles in open government data release and reuse.

Harris, D., Samuel, S., & Probert, E. (2018). GDPR confusion.

Heering, M. S., Yuan, H., & Li, S. (2025). Do security and privacy attitudes and concerns affect travellers’ willingness to use Mobility-as-a-Service (MaaS) systems?

Hsu, I.-J., Lin, C.-H., Yu, C.-M., Kuo, S.-Y., & Huang, C.-Y. (2025). Poisoning attacks to local differential privacy protocols for trajectory data.

Kong, X., Chen, Q., Hou, M., Wang, H., & Xia, F. (2024). Mobility trajectory generation: A survey.

Li, Q., Wu, H., & Dong, C. (2023). A privacy-preserving ride matching scheme for ride sharing services in a hot spot area.

Miranda-Pascual, À., Guerra-Balboa, P., Parra-Arnau, J., Strufe, T., & Forné, J. (2023). SoK: Differentially private publication of trajectory data.

Nelson, T. A., Goodchild, M. F., & Wright, D. J. (2022). Accelerating ethics, empathy, and equity in geographic information science.

Peloquin, D., DiMaio, M., Bierer, B., & Barnes, M. (2020). Disruptive and avoidable: GDPR challenges to secondary research uses of data.

Purwanto, A., Zuiderwijk, A., & Janssen, M. (2020). Citizens’ trust in open government data.

Sieg, L., Gibbs, H., Gibin, M., & Cheshire, J. (2024). Ethical challenges arising from the mapping of mobile phone location data.

Staunton, C., Slokenberga, S., & Mascalzoni, D. (2019). The GDPR and the research exemption: Considerations on the necessary safeguards for research biobanks.

Van de Vyvere, B., & Colpaert, P. (2022). Using ANPR data to create an anonymized linked open dataset on urban bustle.

Wang, J., Lin, Y., & Li, Y. (2025). GTG: Generalizable trajectory generation model for urban mobility.

Wang, T., Zhang, X., Feng, J., & Yang, X. (2020). A comprehensive survey on local differential privacy toward data statistics and analysis.

Yao, A., Madden, M., Buckley, A., Delmelle, E., & Sinha, G. (2022). Bringing ethics to cartography and geographic information science: AutoCarto 2022.

Zhang, W., Xie, Z., Maradapu Vera Venkata Sai, A., Zia, Q., He, Z., & Yin, G. (2023). A local differential privacy trajectory protection method based on temporal and spatial restrictions for staying detection.

Zhang, Y., Ye, Q., Chen, R., Hu, H., & Han, Q. (2023). Trajectory data collection with local differential privacy.

Zhu, W., Jiang, R., Li, D., Kong, X., & Song, X. (2025). Trajectory generative models: A survey from unconditional and conditional perspectives.