The rise of algorithmically generated labor market data, particularly “shadow profiles” compiled by firms like Lightcast, presents a new ethical frontier for Human Capital Management (HCM). These profiles are constructed without individual consent, often scraped or inferred from public data, and used to inform hiring, compensation, and workforce planning decisions. Drawing upon legal frameworks such as the GDPR and CCPA, this article examines the risks of shadow profiling from ethical, legal, and strategic business perspectives. It argues that HCM professionals who adopt such datasets without transparency or safeguards risk violating core ethical principles of autonomy, fairness, and accountability. The article concludes by offering practical guidelines for ethically responsible data practices in HCM.
Keywords:
Business ethics, data privacy, shadow profiles, human capital management, AI, GDPR, CCPA, consent, workforce analytics
As organizations increasingly adopt AI and data analytics in workforce management, the line between innovation and intrusion blurs. Human Capital Management (HCM) now routinely incorporates data from third-party vendors to gain insights into job markets, talent gaps, and even individual behavior. One prominent player in this space is Lightcast (formerly Emsi Burning Glass), whose labor market analytics rely heavily on “shadow profiles”—digitally constructed representations of individuals inferred from public and semi-public data sources.
These profiles are often created without the data subject’s knowledge or consent and are sold to employers, educational institutions, and government agencies. While these datasets promise efficiency, insight, and precision, they also raise profound ethical questions related to privacy, consent, and discrimination.
This paper examines the ethical dilemmas and legal risks posed by shadow profiling practices in HCM and argues for a framework of transparency, accountability, and human dignity in the use of such data.
Shadow profiles are digital representations of individuals created without their direct input or awareness, built from scraped public data (e.g., résumés, LinkedIn, job boards), social media activity, and inferred characteristics. Lightcast claims to aggregate more than one billion job and education profiles globally (Lightcast, 2023), which are used for workforce planning, job market forecasting, and talent acquisition.
These profiles may include inferred job transitions, salary predictions, and even reskilling recommendations—offered as data-as-a-service to employers. While such tools are marketed as anonymized and aggregated, the granularity of the data often allows for re-identification, creating a pseudonymous tracking system that users cannot see, contest, or escape.
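The re-identification risk described above can be illustrated with a toy example: even when names are removed, a handful of quasi-identifiers (job title, metro area, employer size) can single out one record in a dataset, at which point the remaining "anonymous" fields become attributable to a known person. All records, field names, and values below are hypothetical; this is a minimal sketch of the linkage problem, not an account of any vendor's actual data.

```python
# Toy illustration of re-identification via quasi-identifiers.
# All records and field names are hypothetical.

# A "de-identified" vendor dataset: names removed, granular fields kept.
profiles = [
    {"title": "Data Engineer", "employer_size": "1000+",
     "metro": "Austin", "salary_band": "B3"},
    {"title": "Data Engineer", "employer_size": "1000+",
     "metro": "Boston", "salary_band": "B2"},
    {"title": "HR Analyst", "employer_size": "50-200",
     "metro": "Austin", "salary_band": "B1"},
]

def matches(profile, **known):
    """Return True if a profile is consistent with externally known facts."""
    return all(profile.get(k) == v for k, v in known.items())

# An observer who knows only public facts about one person
# (e.g., a job title and city from a LinkedIn page) can often
# narrow the dataset down to a single record.
candidates = [p for p in profiles
              if matches(p, title="Data Engineer", metro="Austin")]

print(len(candidates))                # 1 -> the record is unique
print(candidates[0]["salary_band"])   # the salary band is now attributable
```

The smaller the group of records sharing a combination of quasi-identifiers, the weaker the anonymization; a match set of size one is full re-identification.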
The collection and monetization of individual data without consent violate foundational ethical principles:
The ethical principle of autonomy requires that individuals have control over information about themselves. Shadow profiling bypasses this control by collecting and commodifying data invisibly.
The ethical use of data demands that individuals know when their data is being collected, what it is being used for, and have the ability to opt out. Lightcast and similar companies do not obtain informed consent, nor do they offer meaningful avenues for redress or correction.
AI-driven workforce tools that rely on incomplete or biased datasets may entrench historical discrimination. Predictive analytics can reproduce disparities in pay, opportunity, and performance evaluation, raising issues of procedural and distributive justice (O’Neil, 2016).
Under the GDPR, data controllers must establish a lawful basis, such as consent, for processing personal data (Art. 6) and must grant data subjects the rights to access, rectify, and erase their data (Arts. 15–17). Automated decision-making, including profiling, requires additional safeguards (Art. 22).
Given the lack of consent or subject access mechanisms, shadow profiling arguably violates several GDPR provisions.
The CCPA extends similar protections, allowing California residents to access personal data held by third parties and to opt out of its sale. Lightcast’s operations may fall within the CCPA’s scope if its data informs employment-related decisions about California residents.
HCM professionals who use shadow profiles to inform hiring or workforce decisions may share liability if the underlying data was obtained unlawfully or used discriminatorily (Federal Trade Commission, 2022). The concept of data due diligence is becoming a central concern in HR compliance (Sloan & Warner, 2021).
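The data due diligence idea can be sketched as a simple pre-use gate: before vendor records feed a hiring or workforce decision, each record is checked for documented provenance (a claimed lawful basis, a source, a collection date). The field names and acceptance policy below are hypothetical and purely illustrative, not an established compliance standard.

```python
# Hypothetical due-diligence gate for third-party workforce data.
# Field names and the acceptance policy are illustrative assumptions.

REQUIRED_PROVENANCE = {"lawful_basis", "source", "collected_at"}

def passes_due_diligence(record):
    """Accept a vendor record only if its provenance metadata is complete
    and it claims an explicit lawful basis (not missing or 'unknown')."""
    meta = record.get("provenance", {})
    if not REQUIRED_PROVENANCE <= meta.keys():
        return False
    return meta["lawful_basis"] not in (None, "", "unknown")

records = [
    {"id": "a1", "provenance": {"lawful_basis": "consent",
                                "source": "job_board_optin",
                                "collected_at": "2023-01-05"}},
    {"id": "a2", "provenance": {"source": "scrape",
                                "collected_at": "2023-02-11"}},  # no basis
]

usable = [r["id"] for r in records if passes_due_diligence(r)]
print(usable)  # only the record with documented consent passes
```

A gate like this does not by itself establish legality, but it forces provenance questions to be asked before the data is used rather than after a complaint arrives.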
Universities have increasingly adopted Lightcast’s tools to assess graduate employability, redesign curricula, and track alumni success. However, students are often unaware of the data tracking involved. This raises questions of academic transparency and student autonomy, particularly when decisions affecting curriculum or career advising are made on the basis of opaque, algorithmically generated profiles.
This echoes controversies in the UK and Australia, where predictive analytics in education triggered regulatory scrutiny and public backlash (Williamson, 2020).
In light of the issues raised, the following are proposed as minimum ethical safeguards:

1. Transparency: disclose to candidates and employees when third-party profile data informs decisions that affect them.
2. Consent and opt-out: prefer vendors that obtain informed consent and provide meaningful opt-out mechanisms.
3. Access and correction: ensure data subjects can view, contest, and correct the profiles used about them.
4. Data due diligence: audit vendors for lawful collection, documented provenance, and GDPR/CCPA compliance before procurement.
5. Human oversight: keep an accountable human decision-maker in the loop for any consequential outcome informed by automated profiling.
Shadow profiles promise to revolutionize Human Capital Management through data-driven insights—but not without cost. By circumventing principles of transparency and consent, they threaten the very ethical foundations of employee trust and autonomy. For HCM professionals, the ethical path forward lies not in technological inevitability, but in deliberate, values-based decision-making that centers human dignity.
References
Federal Trade Commission. (2022, March). WW International, Inc. settlement. https://www.ftc.gov/news-events/news/press-releases/2022/03/ftc-finalizes-settlement-agreement-ww-international-inc
Lightcast. (2023). Global labor market data. https://lightcast.io/data
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown Publishing Group.
Sloan, R. H., & Warner, R. (2021). Cybersecurity and privacy law. Aspen Publishing.
Williamson, B. (2020). Big data in education: The digital future of learning, policy and practice. SAGE Publications.