The tracking isn’t the problem. The secrecy is.

In early April 2026, a European association of LinkedIn users called Fairlinked e.V. published a report claiming that LinkedIn injects a large JavaScript bundle – internally referred to as “Spectroscopy” – into every page load for users of Chromium-based browsers. The alleged purpose: to scan for the presence of over 6,000 browser extensions, collect 48 hardware and software characteristics, and transmit the resulting device fingerprint to LinkedIn’s servers and, according to the report, to at least one third-party cybersecurity company.
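For readers unfamiliar with how a page script can "see" extensions at all: many Chromium extensions expose files via the `web_accessible_resources` manifest key, and any web page can try to load those files at a predictable `chrome-extension://` URL. Whether the request succeeds reveals whether the extension is installed. The sketch below illustrates that general technique only – the extension IDs and resource paths are placeholders, not anything from the report, and this is not LinkedIn's actual code:

```javascript
// Illustrative sketch of extension probing via web-accessible resources.
// The IDs and paths below are placeholders, not a real probe list.
const PROBES = [
  { id: "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa", resource: "icon.png" },
  { id: "bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb", resource: "assets/logo.svg" },
];

// Every installed Chromium extension serves its web-accessible resources
// from a fixed origin derived from its 32-character ID.
function probeUrl(id, resource) {
  return `chrome-extension://${id}/${resource}`;
}

// Try each probe; a successful fetch means the extension is present.
// `fetchFn` is injectable (window.fetch in a browser) so the logic can
// also be exercised outside one.
async function detectExtensions(probes, fetchFn) {
  const found = [];
  for (const { id, resource } of probes) {
    try {
      const res = await fetchFn(probeUrl(id, resource));
      if (res.ok) found.push(id);
    } catch {
      // Request blocked or extension absent – treated as "not installed".
    }
  }
  return found;
}
```

Run over a list of thousands of known extension IDs, the yes/no pattern of hits becomes one more dimension of a device fingerprint – which is why a probe list of the alleged size matters more than any single check.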

The findings were independently verified by BleepingComputer and Tom’s Hardware. LinkedIn’s response was that the scanning is a security measure against data scrapers, not surveillance.

Maybe. But that framing sidesteps the more uncomfortable question.


LinkedIn’s privacy policy contains no mention of extension scanning.

That is not a technicality. Under GDPR Articles 13 and 14, organisations must tell people what personal data is being collected, and why, at the point of collection. LinkedIn knows this terrain well – the Irish Data Protection Commission fined them €310 million in October 2024 for failing to meet GDPR requirements around behavioural advertising. The Deputy Commissioner stated at the time: “The lawfulness of processing is a fundamental aspect of data protection law, and the processing of personal data without an appropriate legal basis is a clear and serious violation of a data subject’s fundamental right to data protection.”

Call it BrowserGate: if the allegations hold, it looks like the same pattern applied to a different category of data collection. LinkedIn’s response to the DPC’s 2024 decision was that they “believed they were in compliance with GDPR.” Apparently the belief extended further than the facts did.


Compare this to Meta, by popular reputation the most surveillance-heavy company in the world.

Meta’s privacy policy explicitly discloses that it tracks users across the web via the Meta Pixel – a tracking tag present on an estimated 30% of the most popular websites globally. It names the tool. It describes the commercial purpose. It provides an opt-out mechanism. The scale of Meta’s behavioural data collection is likely larger than LinkedIn’s. The ethics are debatable. But the disclosure is there.

This is not an argument that Meta’s practices are acceptable. It is an argument that the LinkedIn case is specifically about transparency, not about tracking per se. The problem is probably not that LinkedIn watched. The problem is that LinkedIn watched without saying so.


Shoshana Zuboff, in The Age of Surveillance Capitalism, describes this pattern as “the unilateral claiming of private human experience as free raw material for translation into behavioural data.” The word that matters is “unilateral.” The extraction happens without the subject’s knowledge or meaningful participation. What LinkedIn allegedly did with extension data is a precise instantiation of this – not incidentally, on a platform where people store professional identity data they tend to consider separate from the broader consumer web.

There is something particularly uncomfortable about it happening on LinkedIn. People make different assumptions there than they do on consumer social platforms. They may use extensions related to sensitive job searches, health conditions, religious practices, or political views – categories that qualify as special-category data under GDPR Article 9. Whether LinkedIn inferred anything from that data is unknown. That the data was collected without disclosure is, at this point, documented.


This raises a question that feels closer to home for us.

GFoundry collects behavioural data. When an employee engages with a learning module, completes a challenge, earns peer recognition, or returns to content voluntarily, that behaviour is captured and used to personalise their experience and inform the organisation’s understanding of engagement and capability. We also connect to internal HR data – performance records, onboarding flows, organisational structure.

We think there is a meaningful difference between what LinkedIn allegedly did and what a workplace platform does – but we hold that view with some humility, because the line is not always obvious.

The difference we believe matters: context, consent, and purpose.

GFoundry operates inside a work context that employees have explicitly entered. The data collected is bounded to that context – it does not reach into the browser, the personal device, or spaces outside the platform. The purpose is stated: talent development, engagement, personalised learning. The employer, as the GFoundry client, has a legal relationship with the employee that includes data processing obligations under GDPR – which means the processing has a disclosed legal basis.

That is not a complete answer to the ethics of workplace behavioural data. It is a starting point. Questions about who ultimately owns the data, whether employees can meaningfully consent within an employment relationship, and what inferences can legitimately be drawn from behavioural signals are genuinely open. A 2024 study in the Journal of Survey Statistics and Methodology found that only 41% of LinkedIn account holders consented to data linkage when asked directly – suggesting professional platform users have meaningfully stricter consent expectations than the platforms themselves tend to assume.

We do not have clean answers to those questions. We think about them, because the alternative is to stop thinking about them – and that is probably how you end up injecting a fingerprinting script into a page load and calling it a security measure.


The LinkedIn story will probably resolve the way these things do: a regulatory investigation, a fine, some policy changes, and a gradual return to normal. What it probably will not resolve is the underlying question about behavioural data in professional contexts.

That question is worth sitting with: whose behaviour is it, what can legitimately be inferred from it, and what do you owe someone when you collect it?

LinkedIn’s answer, apparently, was: not much disclosure required. We think the answer is probably more complicated than that. And more consequential.