Telegram's founder Pavel Durov says his company only employs around 30 engineers. Security experts say that raises serious questions about the company's cybersecurity.
There are good reasons to dislike Telegram, but having "just" 30 engineers is not one of them. Software development is not a chair factory: more people do not mean more or better work, any more than nine women can deliver a baby in one month.
Edit:
“‘Thirty engineers’ means that there is no one to fight legal requests, there is no infrastructure for dealing with abuse and content moderation issues,” Galperin told TechCrunch.
I don't think fighting legal requests and content moderation is an engineer's job. However, the article can't seem to get it straight whether it's 30 engineers, or 30 staff overall. In the latter case, the context changes dramatically and I don't have the knowledge to tell if 30 staff is enough to deal with legal issues. I would imagine that Telegram would need a small army of lawyers and content moderators for that. Again, not engineers, though.
I can understand if someone like Google or Microsoft employs lawyers directly, as they have the resources and scale to do so. But someone like Telegram should really not do that. They should use an external legal office when needed. Even keep them on retainer, but definitely not open a legal office inside the company.
30 engineers. You lose half of that to people managing the infrastructure alone. That leaves 15 code monkeys. If 2 are dedicated to deployment and 3 to setting up unit tests (that's not many, btw), you are left with 10 people. For a global platform, that's not many at all.
If you have separate developers writing unit tests, instead of every developer writing them as they code, something is already very wrong with your project.
Deployment and infra should also mostly be set up and forgotten, by which I mean general devops like setting up CI and infrastructure-as-code. Using modern practices, which lean towards continuous deployment, releasing a feature should just be a matter of toggling a feature flag, something like the sketch below. Any dev can do this.
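Purely as a sketch of what I mean (the flag name and the env-var lookup are made up for the example; a real setup would read flags from a config service):

```python
# Hypothetical sketch: the new code path ships dark and is switched on by a flag,
# not by a redeploy. In practice the flag would come from a config service.
import os

def is_enabled(flag: str) -> bool:
    # Env-var lookup keeps the example self-contained.
    return os.environ.get(f"FEATURE_{flag.upper()}", "off") == "on"

def send_message(text: str) -> None:
    if is_enabled("new_media_pipeline"):
        print(f"[new pipeline] {text}")    # already deployed, dormant until the flag flips
    else:
        print(f"[legacy pipeline] {text}")

if __name__ == "__main__":
    send_message("hello")  # set FEATURE_NEW_MEDIA_PIPELINE=on to switch paths
```

Releasing is then flipping the flag for some slice of users, and rolling back is flipping it off again; no code push involved.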
Finally, if your developers are 'code monkeys', you're not ready for a project of this scale.
30 engineers is startup-sized. 30 engineers to deal with the needs of sensitive software that is used by millions worldwide and is a huge target for cyberattacks? That's way below the threshold needed.
This sounds like the devs are personally, sword and shield in hand, defending the application from attacks, instead of just writing software which adheres to modern security practices, listening to the Security Officer and occasionally doing an audit.
To be fair, in a large company there are usually only about 30 people who are actually good and know what is going on, and hundreds of others who are checking in trash.
It's not even about the quality of individual people. The organizational structure of large companies encourages pointless work.
Internal mobility and cross-department collaboration are frowned upon. So you get many people doing duplicate work, new ideas don't propagate, and even when someone has an idea it's quickly shut down.
The only way to achieve anything substantial is to be both: 1. assertive and energetic, and 2. at the correct level of the hierarchy. And make no mistake, even if you pull off a miracle there will be no reward. Maybe a 3% raise at the yearly review.
Sorry for the rant, I currently work in a company like this.
Yeah. The most secure companies I’ve worked at actually only had a small group of very competent people, who were paid well, treated with respect, and not presented with a lot of organizational or infrastructural red tape.
I’ve worked with teams of 10 that had shit locked down tight, and teams of hundreds who had software that was exploding and getting exploited left and right.
If someone tells you more head count = security, I would not consider them an expert.
Maybe I’m just lucky with where I sit in a FAANG company, because I’ve only ever been offered mobility in my job, even directly after a promotion! We encourage work across the organization, but we have something like 500 devs in this org.
30? Sometimes far fewer, 2 or 3. It's incredible that some pieces of software used by millions or billions of people have been written, and are sometimes maintained, by 2 or 3 people.
Even if every employee were equally competent, decision making needs to be consolidated enough that it can be decisive and shared throughout a large company. Complex systems that need to change rapidly gain no benefit from having too many people wanting to make decisions; you only need most people to be competent enough to complete the work based on the decisions of a small group, or the work ends up too convoluted and unmaintainable.
There really isn't a benefit to having everyone understand all the parts of a large, complex system if each person only has time to work on a portion of it, or to facilitate decisions that take into account the knowledge of the people working on the different parts.
I see this parroted now and then. Often the people I've heard it from are the type of folks who would drastically underestimate the complexity and effort needed to make things. I've also seen and worked on codebases made by such folks and usually it ain't pretty, or maintainable, or extensible, or secure, or [insert fav cut corners here].
Headline is terrible. The big red flags are that they don't do end-to-end encryption by default, that the servers are in Dubai, and that they use a proprietary algorithm.
The last part should be clarified further. They didn't reinvent AES or anything; it's more that they built a protocol that puts together existing algorithms. It means they can use transport layers without TLS or anything else that would otherwise wrap your messages in crypto.
I'd still say this is a red flag. How you wrap encryption around your messages has several pitfalls you can fall into. It's not as bad as reinventing AES, though.
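To make the "pitfalls" point concrete: the boring baseline is to hand the whole wrapping job to a vetted AEAD from an established library instead of composing primitives yourself. Generic illustration only (not MTProto), assuming the third-party `cryptography` package:

```python
# Generic illustration (not Telegram's MTProto): let a vetted AEAD handle
# confidentiality and integrity in one call instead of composing primitives yourself.
import os
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

key = ChaCha20Poly1305.generate_key()
aead = ChaCha20Poly1305(key)

nonce = os.urandom(12)                     # must never repeat for the same key
msg = b"hello"
ct = aead.encrypt(nonce, msg, b"header")   # ciphertext and auth tag produced together
assert aead.decrypt(nonce, ct, b"header") == msg
```

The classic pitfalls live exactly in the glue this hides: nonce reuse, leaving ciphertext unauthenticated, or getting the encrypt-then-MAC ordering wrong.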
They do explain, though, that given how far below average their headcount is, they're likely understaffed, overworked, and have zero capacity to respond to intrusion attempts.
They seem to have no clue what they are “explaining”, though. I don’t know if those engineers are overworked or how (in)competent they are; I don’t even use Telegram. But they apparently do have other, non-engineering people on staff, and content moderation and dealing with legal issues aren’t the job of an engineering team.
You can make a custom filter with uBlock. I'm not seeing anything with the words Trump, Biden, US, Texas, etc., including the US-politics-related acronyms I have no idea about that kept popping up 😅
Telegram founder Pavel Durov claimed in an interview that the company only employs "about 30 engineers."
Security experts say this is a major red flag for Telegram's cybersecurity, as it suggests the company lacks the resources to effectively secure its platform and fight off hackers.
Telegram's chats are not end-to-end encrypted by default, unlike more secure messaging apps like Signal or WhatsApp. Users have to manually enable the "Secret Chat" feature to get end-to-end encryption.
Telegram also uses its own proprietary encryption algorithm, which has raised concerns about its security.
As a social media platform with nearly 1 billion users, Telegram is an attractive target for both criminal and government hackers, but it seems to have very limited staff dedicated to cybersecurity.
Security experts have long warned that Telegram should not be considered a truly secure messaging app, and Durov's recent statement may indicate that the situation is worse than previously thought.
people have cast doubt over the quality of Telegram’s encryption, given that the company uses its own proprietary encryption algorithm, created by Durov’s brother
To be fair: someone somewhere has to make the algorithms we use. I honestly can’t tell from their white paper whether Telegram’s encryption is strong, or how strong, but I’d be interested in an unbiased evaluation.
“Without end-to-end encryption, huge numbers of vulnerable targets, and servers located in the UAE? Seems like that would be a security nightmare,” Matthew Green, a cryptography expert at Johns Hopkins University, told TechCrunch. (Telegram spokesperson Remi Vaughn disputed this, saying it has no data centers in the UAE.)
Signal sucks from a UI/UX standpoint. When they dropped SMS support, I lost any ability to convince people to switch, and everyone who had already switched left.
Then there's the seamless switching between devices...which it doesn't do.
The UAE is a huge concern. Their terms demand they get to see your code. When the vPBX company I worked for tried to get into the UAE, it was a 10mil boondoggle that ended up ruining them.
The regular chats are encrypted, though, just with an (encrypted) server in the middle. Telegram also claims in their FAQ that no single person has the power to decrypt them, and that the keys are stored such that no single government could force them to give up any data.
How far that is true is a different question though.
What does it matter if it's not E2E encrypted, if they don't care anyway? I know a bunch of chatrooms where you can watch recently released paid movies for free, and Telegram doesn't care.
Telegram is basically creating its own "internet", albeit a much less secure and private one, but it's undoubtedly really useful for finding dev communities (OSS) and support, especially for gray areas like Library Genesis and Z-Library, a bit like what Aaron Swartz envisioned. The only issue is tying everything to your trust in its leadership not to misuse data, which is kinda laughable.
There was a post about this on Lemmy a while ago. I'm not sure which specific community it was, I'm subscribed to a few tech-related ones, but it was at least a week or two ago, about this same story.
I do agree that there should be more than 30 people working on one of the best-known encrypted messaging apps.
After the long-running blog-post holy war between Telegram and Signal, I perceive these "security experts" as either Signal or Telegram shills, depending on their stance.