Agreed, users do sign off on some data collection. However, this consent doesn't extend to all the app's actions, especially the undisclosed ones. That's precisely where access to source code becomes critical. It allows us to identify any hidden, potentially harmful features that could breach privacy or security—issues that go beyond what users have signed up for. Full transparency in source code is vital to ensure that an app fully respects user agreements and trust.
apps can't use hardware features without requesting permissions. it's a security measure that both iOS and Android enforce. I know this because I develop apps. you don't have to look at the source code for this.
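for what it's worth, this is roughly what that looks like on Android. it's just a sketch using the standard AndroidX permission APIs; the activity and helper names are made up, but the flow is the real one: declare the permission, the system shows its own dialog, and the code only gets access if the user says yes.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts
import androidx.core.content.ContextCompat

// AndroidManifest.xml must also declare:
//   <uses-permission android:name="android.permission.CAMERA" />
class CameraActivity : ComponentActivity() {

    // The OS renders the permission dialog; the app only learns yes/no.
    private val requestCamera =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startCamera() else showCameraUnavailableMessage()
        }

    private fun openCameraIfAllowed() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED
        ) {
            startCamera()
        } else {
            // Without this grant, trying to open the camera throws a SecurityException.
            requestCamera.launch(Manifest.permission.CAMERA)
        }
    }

    private fun startCamera() { /* open the camera here */ }
    private fun showCameraUnavailableMessage() { /* degrade gracefully */ }
}
```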
time to stop arguing with an expert on the matter. you really just don't know what you're talking about.
Expertise is one thing, but assuming permissions make source code access irrelevant? That's a stretch. Real tech pros know layers of security are what keep us safe, not just gatekeeping features. If that's your expert take, I'm skeptical.
ok buddy. you're basically just spouting tech terms like you have any clue. the evidence that you have absolutely none is your insistence on arguing with an expert in their field. you can't get any more Dunning-Kruger than this.
good job being a textbook example of one of the world's biggest problems: dummies thinking they know better than actual experts.
Expert or not, the point stands that open source helps everyone check what's really going on. That's not Dunning-Kruger; it's common sense. You're not an expert if you don't understand these basic concepts.
you've strayed from your original argument. the context is that you're supporting a lunatic because he thinks that looking at source code is somehow better than banning foreign apps outright. we already know what these apps are doing and source code doesn't give any further insight.
Just banning foreign apps? That's a knee-jerk reaction, not a solution. We need to know what's in the code, not just where an app's made. Security comes from transparency, not blind bans. Let's not mix xenophobia with tech policy.
Are you daft? Yes, we know what they are doing. Apps inherently have access to a lot of user data. It's not about trust in the code, it's about trust in the company. No amount of looking at TikTok's source code changes what the company may or may not do with all that data it is absolutely collecting.
If you examine the source code of all popular apps, you will find that they all collect and send home as much user data as the user has permitted, which is usually a lot. Knowing that accomplishes nothing. The reason some apps should be banned is what the company does with that data and how it fails to comply with the banning country's laws.
This all stems from a useless talking point from a politician, one that sounds great but doesn't actually accomplish anything. Feel free to keep arguing, but at this point you're basically just telling me that the internet is a series of tubes.
Trust in the company is important, but it's not the whole picture. Reviewing source code can reveal how data is handled on the front end, which is our first line of defense. If the front-end code is designed to collect more data than it should, that's a problem, regardless of the company's reputation.
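To make that concrete, here is a deliberately invented sketch of the kind of thing a source review would flag: an upload handler that quietly bundles far more into its telemetry than the feature needs. Every class and function name below is hypothetical; the point is only that the permission prompt says nothing about which of these fields get sent home.

```kotlin
// Deliberately invented example; every name here is hypothetical.
// The feature only needs photoId, but the extra fields can all be gathered
// under permissions the user has already granted.
data class PhotoUploadEvent(
    val photoId: String,                         // what the upload actually needs
    val capturedAt: Long,                        // reasonable metadata
    val preciseLocation: Pair<Double, Double>?,  // not needed to upload a photo
    val installedAppPackages: List<String>,      // not needed either
    val advertisingId: String                    // ditto
)

// Stand-in interface so the sketch is self-contained.
interface Telemetry {
    fun send(event: PhotoUploadEvent)
    fun lastKnownLocation(): Pair<Double, Double>?
    fun listInstalledApps(): List<String>
    fun advertisingId(): String
}

fun onPhotoUploaded(photoId: String, telemetry: Telemetry) {
    // The permission dialog covered photos and location in general terms;
    // it never mentioned any of the extra fields below.
    telemetry.send(
        PhotoUploadEvent(
            photoId = photoId,
            capturedAt = System.currentTimeMillis(),
            preciseLocation = telemetry.lastKnownLocation(),
            installedAppPackages = telemetry.listInstalledApps(),
            advertisingId = telemetry.advertisingId()
        )
    )
}
```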
It's all about user permissions, which users grant. We already know what data apps are collecting. Can you guess what the authors of the study didn't have when they determined what the apps were doing? The source code.
If users give an app permission (which they always do if they want to use it), please enlighten me: what data do you think an app collects beyond what it should?
Permissions don't cover everything. Say an app has access to photos — that's a green light to use them, but how do we know it's not overstepping? Without checking the source code, we can't. It could be hoovering up all your photos right now. That's why we need to look deeper than permissions.
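And to be concrete about what "hoovering up" could look like: assuming the app has been granted the photos permission (READ_MEDIA_IMAGES on newer Android, READ_EXTERNAL_STORAGE on older versions), a sketch like this one, built on the standard MediaStore API, can enumerate every image on the device without asking the user anything further. Only the source code tells you whether an app does this or only touches the photo you picked.

```kotlin
import android.content.Context
import android.provider.MediaStore

// A sketch, not production code: assuming the photos permission has already
// been granted, this enumerates every image on the device via MediaStore.
fun listAllImagePaths(context: Context): List<String> {
    val paths = mutableListOf<String>()
    val projection = arrayOf(MediaStore.Images.Media.DATA)
    context.contentResolver.query(
        MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
        projection,
        null,  // no selection: every image, not just the one the user picked
        null,
        null
    )?.use { cursor ->
        val dataColumn = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA)
        while (cursor.moveToNext()) {
            paths.add(cursor.getString(dataColumn))
        }
    }
    return paths
}
```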