Open-source intelligence (OSINT) claims to offer all-seeing insight without invading privacy: the ability to track and understand people’s behaviour through the traces they knowingly or unknowingly leave online. Law-enforcement agencies use it for national security, corporations for compliance, journalists for exposure, and activists for accountability. The logic appears unassailable: if the information is already public, why not use it?

Yet this logic collapses under the Kenya Data Protection Act (KDPA). The KDPA anchors Kenya’s constitutional right to privacy. It subjects any processing of personal data, whether manual or automated, private or governmental, to principles of lawfulness, fairness, transparency, purpose limitation, and data minimisation. It creates no exception for data simply because it is accessible. The Act’s insistence that privacy attaches to the person, not to the platform, unsettles OSINT’s foundational assumption that the public domain is a legal free zone.

A crucial provision, Section 28, clarifies the limited circumstances under which personal data may be collected indirectly. The default rule is that data must be collected directly from the data subject. The exceptions are narrow: (a) when the data is contained in a public record; or (b) when the data subject has deliberately made the data public. OSINT practitioners often invoke these clauses as a blanket licence to harvest information from any open source. But they are, in truth, carefully worded safety valves, not permissions for unrestrained data mining.

“Public record” refers to data lawfully held and accessible under statute, such as land registries, company records, or gazette notices. It does not extend to the informal sprawl of social media, blogs, or leaked databases. Nor does “deliberately made public” mean any act of online visibility. A tweet visible only to followers, a Facebook photo shared under limited settings, or a comment on a community forum may be publicly accessible without amounting to a deliberate waiver of privacy. The intent and context of publication matter. The KDPA thus embeds a presumption of restraint: the mere fact of exposure does not extinguish the rights of the data subject.

This interpretive subtlety is precisely where OSINT begins to falter. Its analytical strength lies in aggregation, the ability to combine fragments of open information into coherent profiles. But in doing so, it transforms scattered traces into structured, inferential knowledge about real people. That act of synthesis constitutes processing within the meaning of the KDPA. Once processing occurs, the collector assumes all the attendant obligations: to identify a lawful basis, notify data subjects where practicable, ensure accuracy, and delete or anonymise data when no longer necessary.
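The point about aggregation can be made concrete. Even a trivial script that joins a handful of openly visible fragments into a single profile is, in KDPA terms, "processing" of personal data. The sketch below is purely illustrative; the handle, sources, and field names are hypothetical, not drawn from any real dataset.

```python
from collections import defaultdict

# Hypothetical open-source fragments about one person. Each record alone
# seems innocuous; joined together, they form an inferential profile.
fragments = [
    {"source": "forum",    "handle": "user_123", "field": "city",     "value": "Nakuru"},
    {"source": "registry", "handle": "user_123", "field": "employer", "value": "Acme Ltd"},
    {"source": "social",   "handle": "user_123", "field": "routine",  "value": "gym at 6am"},
]

def aggregate(records):
    """Combine scattered traces into one structured profile per handle.

    Under the KDPA, this act of synthesis is itself 'processing', which
    triggers the controller's obligations (lawful basis, accuracy, erasure).
    """
    profiles = defaultdict(dict)
    for r in records:
        profiles[r["handle"]][r["field"]] = (r["value"], r["source"])
    return dict(profiles)

profile = aggregate(fragments)
# profile["user_123"] now links city, employer, and daily routine in one place.
```

The code does nothing a spreadsheet could not; that is the legal point. The obligations attach not because the tooling is sophisticated, but because the synthesis produces structured knowledge about an identifiable person.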

The tension becomes evident in Kenya’s national-security operations. Section 51 of the KDPA exempts processing for national security and public order, yet offers no procedural guidance or oversight. OSINT tools employed by security agencies (social-media monitoring, geolocation tracking, behavioural analytics) operate in a legal fog where the line between investigation and surveillance blurs. The constitutional promise of privacy under Article 31 risks erosion not through overt violation but through quiet, algorithmic observation shielded by statutory ambiguity.

The corporate sphere fares little better. OSINT has become standard practice in due diligence, recruitment, and reputation management. These activities rely heavily on data that appears “public.” A bank scanning a customer’s digital footprint, a law firm vetting a prospective director, or a consultancy tracking activists online all depend on indirect collection. Section 28’s exceptions may seem to authorise this, but only within reason. The controller must still prove that the processing is proportionate, that the data was indeed public by deliberate choice or legal mandate, and that the data subject’s rights are not overridden. Few institutions meet this threshold in practice.

The weaknesses of OSINT make compliance harder still. Open data is often incomplete, stripped of context, or deliberately distorted. Algorithms can amplify bias and treat assumptions as facts. Under the KDPA, the principles of accuracy and fairness require anyone using such data to interrogate it carefully before making decisions. In this way, the law is not just about following procedures; it is about how we understand and test knowledge itself. It reminds data processors that in the digital age, just because something is visible does not mean it is true.

Artificial intelligence has magnified both the promise and the peril of OSINT. Automated scraping and facial-recognition systems now process millions of data points in seconds, generating behavioural inferences of unsettling precision. These tools blur the boundaries between observation and prediction, between information and surveillance. The KDPA restricts automated decision-making that produces significant personal effects, but enforcement mechanisms remain weak. Cross-border data flows compound the problem: many OSINT platforms rely on servers or data brokers outside Kenya, raising questions about international transfers and adequacy safeguards.

The deeper issue is philosophical. Can a democracy committed to transparency also preserve the sanctity of privacy? Section 28 of the KDPA offers a tentative answer: it allows limited indirect collection but refuses to collapse privacy into publicity. It recognises that openness and autonomy are not opposites but counterweights. Public information may exist, but its ethical and lawful use requires purpose, necessity, and proportionality. The fact that data can be seen does not mean it can be exploited.

A culture of scepticism, not enthusiasm, must therefore govern OSINT in Kenya. Regulators, investigators, journalists, and corporations alike should view Section 28 as a constraint, not a loophole. The Office of the Data Protection Commissioner should issue interpretive guidance clarifying what constitutes “deliberately made public” and establishing accountability for misuse of open data. Without such boundaries, the republic risks normalising surveillance under the guise of openness.

In the end, OSINT and the KDPA embody a modern paradox: the more transparent society becomes, the more invisible the watcher. Kenya’s challenge is to ensure that intelligence gathering, however open-sourced, does not corrode the dignity it claims to defend. Section 28 reminds us that even in the age of radical visibility, the human person remains the rightful owner of their story.