Chat Apps, Government Ties, and Transparency

Over the past few days, two popular chat services have accused each other of having undisclosed government ties. According to Signal president Meredith Whittaker, Telegram is not only “notoriously insecure” but also “routinely cooperates with governments behind the scenes.” Telegram founder Pavel Durov, on the other hand, claims that “the US government spent $3 million to build Signal’s encryption” and that Signal’s current leaders are “activists used by the US state department for regime change abroad.”

We usually try to steer clear of industry gossip and social media disputes, but the current debate is a great opportunity to highlight some interesting points that are widely overlooked and may be of interest to any user of any messaging platform.

Why is this even relevant?

Some of the accusations seem so far-fetched and outlandish that most people probably dismiss them as conspiracy theories or FUD tactics right away. And while it’s certainly wise to remain skeptical of implausible claims, it’s worth bearing in mind that there are, in fact, multiple instances where supposedly secure communication services were infiltrated or even run by government agencies without users noticing it: ANOM, Crypto AG, and EncroChat, for example.

Background

Referring to an article about Signal board member (and new NPR CEO) Katherine Maher and her alleged government ties, Elon Musk tweeted that “there are known vulnerabilities with Signal that are not being addressed.” Meredith Whittaker (along with many others) took this statement literally and promptly denied it. Then, Pavel Durov chimed in to raise further allegations against Signal, to which Whittaker responded by voicing her concerns about Telegram.

Taking Security out of the Equation

As anyone familiar with secure communication can tell, Telegram cannot be considered secure by current industry standards – not by a long shot. It’s primarily a cloud messenger, meaning that messages are permanently stored on a server, aren’t end-to-end encrypted, and could be read by Telegram at any time. End-to-end encryption is only available in one-on-one chats, where it has to be enabled manually for each conversation.
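
To illustrate the difference, here’s a minimal sketch in Python (using the PyNaCl library; all names are illustrative, and this is not how Telegram or Signal is actually implemented) contrasting a cloud messenger, whose operator can read stored messages, with an end-to-end encrypted one, whose operator only ever sees ciphertext:

```python
# Minimal sketch: cloud storage vs. end-to-end encryption.
from nacl.public import PrivateKey, Box

alice, bob = PrivateKey.generate(), PrivateKey.generate()
server_db = []

# Cloud model: the server stores the message in a form the operator
# can read at any time.
server_db.append(b"Hi Bob!")

# End-to-end model: only the endpoints hold the keys; the server
# only ever stores ciphertext.
ciphertext = Box(alice, bob.public_key).encrypt(b"Hi Bob!")
server_db.append(bytes(ciphertext))  # opaque to the operator

# Bob decrypts locally with his private key.
assert Box(bob, alice.public_key).decrypt(bytes(ciphertext)) == b"Hi Bob!"
```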

Signal, on the other hand, is widely respected for its cryptography (and was the second cross-platform messaging app to offer consistent end-to-end encryption, after Threema). Yet, Durov claims that Signal messages of “important people” he’s spoken to have been exploited in US courts or media (probably referring to Tucker Carlson, who recently interviewed Durov and had earlier stated that the NSA broke into his Signal account).

However, such stories float around the Internet about almost any secure chat app. In the cases that are actually true, authorities were most likely able to gain physical access to some mobile device, which could also happen to be the device of one of the target’s chat partners. In high-profile cases, it is, of course, also possible that the target’s device was infected with spyware on the OS level, in which case the whole device is compromised, and the security of any app running on it (including Signal, Telegram, and Threema) goes out the window. Durov’s claims regarding Signal’s security should, therefore, not be taken too seriously. He does, however, bring up two other points worth mentioning.

First, he states that “the exact same encryption [as Signal’s] is implemented in WhatsApp, Facebook Messenger, Google Messages and even Skype.” Proposing that the reason for this might be that “big tech in the US is not allowed to build its own encryption protocols that would be independent of government interference” is clearly a FUD tactic. However, it’s still notable that iMessage, which Durov failed to mention, is the only mainstream messaging app developed in the US that doesn’t rely on Signal’s encryption. And while it’s not advisable to “roll one’s own crypto,” a monoculture probably isn’t ideal, either.
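
For context, the hallmark of Signal’s encryption is its key ratcheting: every single message is encrypted with a fresh key derived from a KDF chain. The following heavily simplified sketch shows only the symmetric part of that ratchet (the real Double Ratchet also mixes fresh Diffie-Hellman output into the chain); it’s an illustration of the general technique, not any app’s actual implementation:

```python
# Simplified symmetric-key ratchet, the core idea behind per-message
# keys in the Signal protocol (illustration only).
import hmac, hashlib

def kdf_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key and the next chain key."""
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return message_key, next_chain

chain = b"\x00" * 32  # stands in for the secret from the initial handshake
for i in range(3):
    message_key, chain = kdf_step(chain)
    print(f"message {i}: key {message_key.hex()[:16]}...")

# Each message gets its own key, and the chain only moves forward:
# compromising today's key doesn't reveal yesterday's messages.
```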

Second, Durov mentions that, in contrast to Signal, Telegram offers reproducible builds on both platforms, Android and iOS. That’s true. Signal (like Threema) currently only offers reproducible builds on Android. Due to restrictions on Apple’s part, it’s not possible to support reproducible builds on iOS in a straightforward and completely satisfactory way. However, you have to give Telegram props for going the extra mile and providing at least the best available solution.
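
A reproducible build lets anyone confirm that the binary distributed in an app store was actually built from the published source code. The check itself is conceptually trivial, as the following sketch illustrates (the file names are hypothetical, and real-world verification additionally involves pinned toolchains and a normalized build environment):

```python
# Sketch of what reproducible builds let a user verify: an app built
# from the published source must be bit-for-bit identical to the
# binary shipped in the store. File names are hypothetical.
import hashlib

def sha256(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

store_apk = sha256("app-from-play-store.apk")    # what users actually run
own_build = sha256("app-built-from-source.apk")  # built locally from source

# Matching hashes prove the store binary corresponds to the audited
# source; a silently introduced backdoor would break the equality.
print("verified" if store_apk == own_build else "MISMATCH: binaries differ")
```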

Comparing Messaging Services

Of course, Telegram’s reproducible builds for iOS are a small benefit compared to Signal’s full end-to-end encryption, and switching from Signal to Telegram for security reasons (as some people apparently do) is clearly a mistake.

However, when it comes to comparing messaging services, cases are rarely as clear-cut as in this instance. As any messenger comparison shows, there is a wide range of aspects that factor into the overall security and privacy protection of a communication service. And even the most comprehensive comparison can only list a comparatively small selection of relevant aspects.

What’s more, the combination of certain factors is what really makes a difference in practice. For example, reproducible builds suddenly play a very important role if a service is based in a country where developers could be forced by the government to introduce backdoors into their software without disclosing it.

Government Ties and Transparency Reports

Durov’s claim regarding the $3 million grant Signal received from the Open Technology Fund is easy to verify. However, it’s unclear why this partial initial funding should be relevant today. (For comparison, Brian Acton, co-founder of WhatsApp and current CEO of Signal, injected over $100 million into Signal.)

What Whittaker is referring to when she says that Telegram “routinely cooperates with governments behind the scenes,” however, isn’t quite clear. If she’s simply referring to instances where Telegram had to comply with applicable law, it would hardly be worth mentioning. If, on the other hand, she has some sort of inside information, it would be important to let Telegram users know.

What’s most interesting, though, is that despite accusing one another of undisclosed government ties, neither of these companies provides a transparency report worthy of the name. Telegram’s transparency report might as well be called an “intransparency report”: it is region-specific, only accessible via the Telegram app, and doesn’t return any results for Switzerland, which is highly dubious considering the size of Telegram’s user base. In our tests, trying to retrieve reports for other countries, such as Germany or the US, failed as well.

While Signal does have a transparency report that’s publicly accessible, it seems to be grossly incomplete and badly out of date. It contains a total of only five entries, the latest dating back to 2021. Compare that to the figures of our own transparency report (170 legal requests in 2023 alone) or, for example, Proton’s (6,378 requests in 2023). There’s simply no way a service of Signal’s size has received only five government requests in a span of almost ten years. (Since Signal is based in the US, it’s possible that they have received a gag order preventing them from disclosing or publishing certain information.)

Closing Remarks

However deep government ties may or may not run, the transparency report is an important and objective metric. Users should be informed about (a) the type of user data a service is able to share with authorities, (b) the circumstances under which this will happen, and (c) the extent to which such data transfers have actually occurred in the past.
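
As a purely hypothetical sketch, a report entry covering these three points could look as follows (no real service’s report format is implied, and all figures are invented for illustration):

```python
# Hypothetical transparency-report entry covering points (a)-(c).
# All values are invented; no real service's figures are implied.
report_entry = {
    "period": "2023",
    # (a) the data the service is technically able to hand over
    "disclosable_data": ["linked phone number", "account creation date"],
    # (b) the circumstances under which it hands data over
    "legal_basis": "valid court order under applicable national law",
    # (c) how often this actually happened
    "requests_received": 120,
    "requests_complied_with": 95,
}
print(report_entry)
```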

What if there’s no relevant data to share?

One might argue that government requests don’t matter all that much in cases where there’s no relevant data to obtain in the first place. While that’s true in theory, such cases aren’t the norm in practice, and they require advanced measures only very few services provide (e.g., a custom push service like Threema Push). Authorities are quite resourceful when it comes to determining the identity of Internet users, and a service might be used to indirectly reveal the identity of one of its users even though the service itself doesn’t know that identity at all.
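
To make the push-service example concrete: with ordinary push notifications, a service has to route each delivery request through a centralized gateway (Google’s or Apple’s), handing over a token that identifies one specific device. The following sketch of such a request is purely illustrative, not any real gateway’s API:

```python
# Illustrative push-delivery request as a centralized gateway might
# see it. Purely hypothetical; no real provider's API is implied.
push_request = {
    "to": "device-token-abc123",       # uniquely identifies one device
    "data": {"event": "new_message"},  # often just a wake-up signal
}

# Even if message content stays end-to-end encrypted, the gateway
# operator learns who received a notification from which service and
# when. That metadata can be correlated with an account and, via
# legal requests, with a real-world identity. A self-hosted push
# channel (a persistent connection the app maintains itself, which is
# what Threema Push provides) removes this third party from the loop.
print(push_request)
```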

In 2024, providing a proper transparency report should be standard practice for any online service, but it’s especially important for services that care about user privacy and ones that require users to provide personal information (as both Signal and Telegram do).

Providing an incomplete transparency report not only defeats its whole purpose; in some regard, it is even worse than providing no transparency report at all because it misrepresents the facts and potentially deceives users.