Welcome to the Notebookcheck.com forum! Here you can discuss all of our articles and notebook-related topics in general. Have fun!


Topic summary

Posted by vertigo
 - November 21, 2020, 01:15:22
Quote from: _MT_ on November 20, 2020, 17:45:33
Have you read the T&Cs? It's data. They can sell it, exchange it, share it. For example, they can create an association that pools the data and has it analyzed on behalf of the members, so they can gain similar insights as e-shops do. The most problematic part here is exchanging identifying information so they can link you across loyalty programs. Unless they fancy breaking the law, they need consent. And they can get it through your membership. I do wonder if perhaps hashing identifying information could be of use here. It can be done so the original identifying information is not retrievable (they can't retrieve your name or address), but it would still be unique to you (or at least very close to it) and therefore allow linking, as long as they all use the same method. Of course, it's not bulletproof. It's feasible to create a reverse lookup table. You've got one calculation per person. You just need the information. The hash doesn't give you the information, but if you already have it, it doesn't prevent linking. Facilitating linking is, after all, the goal.

As for Utopia, their argument is flawed. Open source doesn't necessarily mean free to use however you want. It's possible to publish source code without giving permission to create derivatives. Even commercial software can be open source. Peer review is really important; it's so easy to screw up.

Unfortunately, I'm guilty of not reading those, just like most people. I don't have the time or legal expertise. And that's why laws need to be passed to rein them in, but that's unlikely to happen, because government officials don't work for the people; they work for the companies that buy them off. And what you're describing is essentially what your advertising ID with Google is. And just as I'm sure Google still knows exactly who you are, and the only things the ID provides are possibly anonymity from third parties and some (probably false) peace of mind, using such a system with loyalty programs would probably be the same. But better than nothing, I guess.

Yeah, Utopia doesn't really inspire much confidence IMO, certainly not enough to trust them with what they're asking.
Posted by _MT_
 - November 20, 2020, 17:45:33
Quote from: vertigo on November 19, 2020, 18:20:36
With the loyalty programs, I understand how they're used to track purchasing patterns with the company, but I don't see how it could be used to track you outside the company, i.e. to see where else you shop or what else you buy.
Have you read the T&Cs? It's data. They can sell it, exchange it, share it. For example, they can create an association that pools the data and has it analyzed on behalf of the members, so they can gain similar insights as e-shops do. The most problematic part here is exchanging identifying information so they can link you across loyalty programs. Unless they fancy breaking the law, they need consent. And they can get it through your membership. I do wonder if perhaps hashing identifying information could be of use here. It can be done so the original identifying information is not retrievable (they can't retrieve your name or address), but it would still be unique to you (or at least very close to it) and therefore allow linking, as long as they all use the same method. Of course, it's not bulletproof. It's feasible to create a reverse lookup table. You've got one calculation per person. You just need the information. The hash doesn't give you the information, but if you already have it, it doesn't prevent linking. Facilitating linking is, after all, the goal.
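The hashing idea above can be sketched in a few lines. This is purely illustrative (the normalization scheme and the choice of fields are assumptions, not anything a real loyalty program is known to use), but it shows both the linking property and the reverse-lookup weakness in one place:

```python
import hashlib

def link_token(name: str, address: str) -> str:
    """Derive a pseudonymous token from identifying information.

    The token is stable for the same person, so different loyalty
    programs using the same scheme can link records, but the name
    and address cannot be read back out of the token itself.
    """
    # Normalization matters: "Jane Doe" and " jane doe " must hash
    # to the same token for cross-program linking to work.
    normalized = f"{name.strip().lower()}|{address.strip().lower()}"
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Two programs independently compute the same token for one customer.
t1 = link_token("Jane Doe", "1 Main St")
t2 = link_token("  jane doe", "1 MAIN ST  ")
assert t1 == t2

# The weakness described above: anyone holding a list of candidate
# identities can build a reverse lookup table, one hash per person.
rainbow = {link_token(n, a): (n, a) for n, a in [("Jane Doe", "1 Main St")]}
assert rainbow[t1] == ("Jane Doe", "1 Main St")
```

So the hash hides the identity from someone who only holds tokens, but it does nothing against someone who already holds the identities and wants to link them, which is exactly the trade-off described above.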

As for Utopia, their argument is flawed. Open source doesn't necessarily mean free to use however you want. It's possible to publish source code without giving permission to create derivatives. Even commercial software can be open source. Peer review is really important; it's so easy to screw up.
Posted by vertigo
 - November 19, 2020, 18:34:36
Quote from: Hawkin on November 19, 2020, 14:16:41
Want some anonymity and cyber security?
Join Utopia Ecosystem and sleep well)

This appears to be an attempt to replicate Tor while building in more functionality. Considering such a system is only effective with a lot of users, I wouldn't rely on it for a while, until it's able to build up a decent user base. I also wouldn't rely on it for a while simply because it's new, and you should never trust beta products, which this essentially is, for security and privacy. Yet another reason I wouldn't trust it is that it's not open-source, and therefore can't be audited. They explain their reasoning in the FAQ*, but regardless of that, if the code can't be checked, there's no telling whether a) it's doing what it says it's doing, b) there are no backdoors, and c) it's done properly, i.e. that it doesn't contain bad, insecure code.

*I've copy-pasted the relevant part of the FAQ below, which covers why it's not open-source, since they seem to care more about making the site look all hackerish, with a fancy green font on a black background that simultaneously makes it difficult to read and makes it look unprofessional, like they're trying too hard to come across a certain way. Call me crazy, but I have trouble placing faith in people who can't even ensure their website is legible to write and self-audit code that's responsible for ensuring privacy.

"We may disclose certain parts of code, specifically related to communication and encryption. However, the decentralized protocol will not be released. Utopia is very knowledge-intensive software. A lot of time, effort and resources went into this product, and we do not want to share all of our know-how as it will result in forks which in turn may result in instability of our main network. Fork will lead to the division of the community, while our intention is the unification of the community of like-minded individuals. The bottom line here is that a lot of software is closed source, and this does not hurt them a bit. In addition, we will audit our code."
Posted by vertigo
 - November 19, 2020, 18:20:36
Quote from: _MT_ on November 19, 2020, 11:38:02
Quote from: vertigo on November 18, 2020, 18:03:38
IMO, an IP address should absolutely be considered personal, since it gives your approximate location and can be used in combination with other data to determine your identity and to build a profile on you.
...
As an interesting side note, which would be humorous if it weren't so sad: when I searched Big Sur to see if it's for x86 (I didn't look too deep, but it does appear it is), the first result was from Apple, which says "macOS Big Sur elevates Mac to a new level of power and beauty with a refined new design, major app updates, and more transparency around your privacy" (emphasis added). /smh
This has been going on for years. The problem, the change with Big Sur, if I understand it correctly, is that a user-level firewall can no longer block system-level processes (which include trustd). And the same is true for VPNs: user-level VPN software cannot shove traffic from system-level processes into the tunnel. Those processes have to cooperate on their own. And yes, this is potentially very concerning (it could make you, for example, more vulnerable on public wireless networks). It depends on how much you trust Apple to do a good job.

I don't think approximate location is good enough. I'm not sure even street-level location would be enough, much less a quarter. A German court argued that a website operator could persuade an ISP to disclose your personal information, and that therefore an IP is personal (they presented two lines of thinking, and this was one of them). To me, that sounds like bollocks. An ISP shouldn't disclose personal information outside of court orders and the like; doing so should be illegal and harshly punished. The real argument for me is that while a particular operator might not be able to match an IP with identifying information, someone else might. Why do we even care? It's partly because we treat non-personal data poorly. It's more likely for such data to be given to someone else, or stolen and otherwise mishandled. It should definitely be properly protected.

The reality is that I don't need to keep your IP address outside of compliance with the law (where I should treat it as personal) and prevention of network attacks or diagnosis of network issues (where I should store it separately and only short term). I care about performance. I care about errors. I might care about what you requested and where you came from when an error occurred. Consider a broken link: knowing where you came from is a big help. But I don't care who you are as a human; I care about you as a user. I can identify you in a way that doesn't disclose your IP address and is not persistent beyond a browsing session, and use that in logs, making them useless to third parties if they get stolen. But I will still be able to track you throughout my website for the purposes of usage analysis. Of course, if you're a shop, for example, it might be possible to link this ID with personal information via orders and invoices (timestamps, order numbers in URLs, etc.); that part of your system should be built accordingly. Once a problem is fixed, there is no need to keep an expanded log entry. If there were no problems, there is no need to keep the logs for long. I might want to keep performance data, but I care about the numbers for future reference, not individuals. If you're analyzing the effectiveness of a marketing campaign, you should delete the data afterwards, or periodically if it's long-running. There is always a way. The reasons to desire such information are not technical.

The problem in this space is called Google Analytics, and share and like buttons from social networks. And of course, advertising networks. GA is a nice tool; I get why people use it. But frankly, some of that information is simply none of your business. I understand that the operator of one shop might want to know where else his customers are shopping. But they have no right to snoop like that. Imagine hiring people to follow your customers around and log where they go and what they do. Creepy as hell. Too expensive in real life (that's why they use different methods, like loyalty programs), but easy to do on the Internet. Tracking should be strictly opt-in, with a strict ban on tricking, manipulating, pushing and bullying. Of course they don't like the idea, as almost nobody would cooperate. They would probably have to pay people money to entice them into giving up privacy (like it's done with loyalty programs).

Approximate location can be used in combination with other details to identify you or, at the very least, narrow it down quite a bit. Depending on who's trying to identify you, they could use a combination of any number of things, such as sites visited, what browser you're using, screen resolution, installed fonts, OS, how you type, mouse movements, etc. An IP can help by providing more info on you (where you live, whether down to the city or, in more rural areas, down to one of very few houses) and by vastly reducing your anonymity pool (instead of having to be separated from millions of people using the other available factors, you now only need to be separated from anywhere from a couple dozen to several thousand).

And I agree ISP shouldn't be able to disclose it; unfortunately, they're allowed to do much more than that in the US, with a recent law allowing them to collect info on you, e.g. sites you visit and probably things like your schedule and how much time you spend online, and sell it. So even though you're (over)paying them for a service, that's still not enough for them, and they want to make even more money off of you on the back end. It's reprehensible, and I wish I could say it's unbelievable the law passed, but that's just the state of the corrupted government these days.

I also agree about the issues with tracking, which, combined with the ISP issue, is why I use a VPN combined with uBO, Privacy Badger, and other precautions. It's just ridiculous that it requires so much extra work and money to protect oneself because not only does the government do nothing to help, but they're often complicit. And even then, doing all that, I'm well aware that my identity is still likely known to at least some players (government and Google at least), even when it appears that I'm anonymous. It's crazy. And most people simply don't know how bad it is, or how to protect against it. And as you're probably aware, it's so much worse than social media buttons and analytics and ads. Things like fingerprinting, tracking pixels, cookies, referers, AMP, etc. And that's just the web. Then there's OS snooping, Amazon and Google listening to and recording you through your phones/tablets/Echo/etc, Google (and probably Apple) tracking you through your phone, even after you "disable" it, and so on. And then there's companies using people to do free work for them, training their systems, by providing tons and tons of photos for facial recognition and solving captchas to train image recognition. It's all just a big mess.

With the loyalty programs, I understand how they're used to track purchasing patterns within the company, but I don't see how they could be used to track you outside the company, i.e. to see where else you shop or what else you buy. So while you're losing a little bit of privacy, it's not much. Instead of the company just knowing it sold products A and B, it knows it sold them to you specifically, though unless you pay cash it can know that anyway, or at least that credit card # 1234.... bought those items. And if they really wanted to, they could probably tie that card to you even without your knowledge or consent. So I don't see loyalty programs as much of a privacy issue personally, though I'd be interested to know if I'm missing something regarding them.
Posted by Hawkin
 - November 19, 2020, 14:16:41
Want some anonymity and cyber security?
Join Utopia Ecosystem and sleep well)
Posted by _MT_
 - November 19, 2020, 11:38:02
Quote from: vertigo on November 18, 2020, 18:03:38
IMO, an IP address should absolutely be considered personal, since it gives your approximate location and can be used in combination with other data to determine your identity and to build a profile on you.
...
As an interesting side note, which would be humorous if it weren't so sad: when I searched Big Sur to see if it's for x86 (I didn't look too deep, but it does appear it is), the first result was from Apple, which says "macOS Big Sur elevates Mac to a new level of power and beauty with a refined new design, major app updates, and more transparency around your privacy" (emphasis added). /smh
This has been going on for years. The problem, the change with Big Sur, if I understand it correctly, is that a user-level firewall can no longer block system-level processes (which include trustd). And the same is true for VPNs: user-level VPN software cannot shove traffic from system-level processes into the tunnel. Those processes have to cooperate on their own. And yes, this is potentially very concerning (it could make you, for example, more vulnerable on public wireless networks). It depends on how much you trust Apple to do a good job.

I don't think approximate location is good enough. I'm not sure even street-level location would be enough, much less a quarter. A German court argued that a website operator could persuade an ISP to disclose your personal information, and that therefore an IP is personal (they presented two lines of thinking, and this was one of them). To me, that sounds like bollocks. An ISP shouldn't disclose personal information outside of court orders and the like; doing so should be illegal and harshly punished. The real argument for me is that while a particular operator might not be able to match an IP with identifying information, someone else might. Why do we even care? It's partly because we treat non-personal data poorly. It's more likely for such data to be given to someone else, or stolen and otherwise mishandled. It should definitely be properly protected.

The reality is that I don't need to keep your IP address outside of compliance with the law (where I should treat it as personal) and prevention of network attacks or diagnosis of network issues (where I should store it separately and only short term). I care about performance. I care about errors. I might care about what you requested and where you came from when an error occurred. Consider a broken link: knowing where you came from is a big help. But I don't care who you are as a human; I care about you as a user. I can identify you in a way that doesn't disclose your IP address and is not persistent beyond a browsing session, and use that in logs, making them useless to third parties if they get stolen. But I will still be able to track you throughout my website for the purposes of usage analysis. Of course, if you're a shop, for example, it might be possible to link this ID with personal information via orders and invoices (timestamps, order numbers in URLs, etc.); that part of your system should be built accordingly. Once a problem is fixed, there is no need to keep an expanded log entry. If there were no problems, there is no need to keep the logs for long. I might want to keep performance data, but I care about the numbers for future reference, not individuals. If you're analyzing the effectiveness of a marketing campaign, you should delete the data afterwards, or periodically if it's long-running. There is always a way. The reasons to desire such information are not technical.
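The session-scoped identifier described above can be sketched roughly as follows. The class and method names are hypothetical, and a real deployment would need to handle shared IPs, proxies, and session expiry properly; this only illustrates the principle that logs carry a disposable token instead of the IP:

```python
import secrets

class SessionIds:
    """Map client IPs to random per-session tokens for logging.

    A token identifies a visitor within one browsing session, so usage
    analysis still works, but log lines never contain the IP, and the
    IP-to-token mapping is discarded when the session ends.
    """
    def __init__(self):
        self._ids = {}

    def token_for(self, ip: str) -> str:
        if ip not in self._ids:
            self._ids[ip] = secrets.token_hex(8)  # random, not derived from the IP
        return self._ids[ip]

    def end_session(self, ip: str) -> None:
        self._ids.pop(ip, None)  # forget the mapping; old logs become unlinkable

ids = SessionIds()
tok = ids.token_for("203.0.113.7")
print(f"{tok} GET /broken-link 404")  # log line carries the token, not the IP

ids.end_session("203.0.113.7")
assert ids.token_for("203.0.113.7") != tok  # new session, new identity
```

Because the token is random rather than a hash of the IP, a stolen log gives a third party no way to recover or even correlate addresses across sessions.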

The problem in this space is called Google Analytics, and share and like buttons from social networks. And of course, advertising networks. GA is a nice tool; I get why people use it. But frankly, some of that information is simply none of your business. I understand that the operator of one shop might want to know where else his customers are shopping. But they have no right to snoop like that. Imagine hiring people to follow your customers around and log where they go and what they do. Creepy as hell. Too expensive in real life (that's why they use different methods, like loyalty programs), but easy to do on the Internet. Tracking should be strictly opt-in, with a strict ban on tricking, manipulating, pushing and bullying. Of course they don't like the idea, as almost nobody would cooperate. They would probably have to pay people money to entice them into giving up privacy (like it's done with loyalty programs).
Posted by vertigo
 - November 18, 2020, 18:03:38
IMO, an IP address should absolutely be considered personal, since it gives your approximate location and can be used in combination with other data to determine your identity and to build a profile on you. I have no doubt a company like Apple, Google, Facebook, Microsoft, Amazon, etc. can make use of that to further invade people's privacy.

I also definitely agree that certificate databases should be maintained on individual computers, rather than performing cloud lookups. I hate the trend of security software relying on the cloud. Not only is it a massive privacy violation (which I wouldn't be surprised is the real reason for doing it that way), but it's a real problem on a slow or metered connection. I had to disable Microsoft's SmartScreen "protection" because it never worked: when trying to open apps, it would just sit there for a minute or two, seemingly doing nothing, because it couldn't connect and look them up. There's no reason whatsoever not to just have a local database that's kept up to date and can be checked, and many other security apps do just that.

As an interesting side note, which would be humorous if it weren't so sad: when I searched Big Sur to see if it's for x86 (I didn't look too deep, but it does appear it is), the first result was from Apple, which says "macOS Big Sur elevates Mac to a new level of power and beauty with a refined new design, major app updates, and more transparency around your privacy" (emphasis added). /smh
Posted by _MT_
 - November 18, 2020, 16:57:18
I have to correct myself. I was under the impression that Big Sur was ARM-only. I wasn't expecting it to roll out to x86 as well.

If you have a service that checks whether a certificate has been revoked for some reason, signalling that an application might not be secure, that traffic can reveal what you've been running. That service has been around for a long time, I think since Sierra. They certainly shouldn't keep logs of who asked about what. There is no need for it (to render the service), and therefore they shouldn't (definitely not for long). But you're sending the information over, and you can only trust them not to keep it. I guess the only way around it would be to flip the information flow: publish a list of all revoked certificates so that your computer can check internally. From a legal perspective, a problem here is that something like information identifying a certificate used to sign an application, or the application itself, won't be considered personal. It can only become personal in combination with other data. Consider that an IP address is not necessarily personal. An e-mail address is not necessarily personal either. In the position Apple is in, they probably can link an IP address with a person or a household for a significant portion of the user base. Therefore, they should treat it as personal information. And that might be why they'll be deleting it.
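The "flipped" flow suggested above can be illustrated with a trivial sketch. The serial numbers and list format here are made up for illustration, and Apple's actual revocation mechanism may differ entirely; the point is just that a locally held list means no per-launch network request, so nothing about what you run leaves the machine:

```python
# A periodically downloaded revocation list (e.g. refreshed daily),
# held entirely on the client. Serials below are invented examples.
REVOKED_SERIALS = {
    "4f3a9c01",
    "77b0e2d9",
}

def is_revoked(cert_serial: str) -> bool:
    """Local lookup: no query to a remote service, so the vendor
    never learns which applications this machine is launching."""
    return cert_serial in REVOKED_SERIALS

assert is_revoked("4f3a9c01")       # a revoked signing certificate
assert not is_revoked("deadbeef")   # an unknown serial passes the check
```

The trade-off is bandwidth and freshness: the client must fetch the whole list and may act on slightly stale data between refreshes, which is the usual cost of moving a check from the server to the device.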

Privacy and the public's take on it are interesting topics. Consider how people treat pictures that include strangers' faces or cars' registration plates. Where I come from, it's generally illegal to publish such a picture without removing the offending parts or getting consent. But it seems like people couldn't care less. I guess it would be too much work for them to edit their photos before publishing them.
Posted by vertigo
 - November 18, 2020, 16:45:24
Quote from: _MT_ on November 18, 2020, 15:00:04
Quote from: vertigo on November 18, 2020, 05:00:00
As for the comments regarding common people not caring about this, a) true, but sad, and I'll never understand how people can just not care at all about this kind of thing
I guess it's because they can't imagine how it could do harm. And frankly, even people inventing and developing such technologies often can't imagine where it will lead and how it can be (mis)used.

If there is no way to opt out, that in itself should make it illegal. It is useful to know how people use your products. But all such campaigns should be strictly opt-in, and there should be clear rules: what you're collecting, what for, and how long you're going to keep it. If you're doing a probe into users' behaviour to inform future development, you don't need to keep the data for ten years. You can't collect for no reason, or just in case. Nor can you use the data for another purpose.

However, there is one question I have: are we talking about units in a developer program, or standard retail units? Since retail units were supposed to start shipping on the 17th, if I'm not mistaken, I guess we're talking developers here. And that can complicate the situation a bit (they have a contract with Apple, right?).

Encryption is a double-edged sword. Like everything, really (every coin has two sides). It was irresponsible of them not to encrypt the data. On the other hand, effective encryption would prevent us from figuring out what our device is sending.

Unfortunately, that theory doesn't seem to hold much water, at least not all the time. Yes, some people just have no clue, but that's getting a lot harder with the constant news about this kind of stuff, to the point where you'd have to be very technically illiterate and/or oblivious not to know about it by now. But more significantly, almost every single person I've explained the issue to, along with the possible repercussions involved, has ranged from couldn't-care-less (most commonly) to somewhat interested, but clearly not enough to actually do anything to protect themselves or discourage the behavior. As for the creators not realizing the potential impact of their creations: yes, that's true sometimes, and hindsight is 20/20, but in cases like this, there's no way a company like Apple, or any company with even a modicum of tech knowledge, would be unaware of the potential for harm.

As for the encryption, if Apple were transparent about what they're doing, they could easily have the system log the info locally, so the user can see what's going on, and then encrypt and send it. Of course, that would require trust that they're actually sending what they say they are, though that would be easier to extend if they were being transparent instead of getting caught red-handed. I'm sure they could also keep the encryption keys on each computer (obviously different for each one), so the user could intercept the traffic and use the key to decrypt it to police them, whereas anyone else wouldn't have the key, so the traffic would still be protected.
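A rough sketch of that scheme: the device holds its own key, so the owner can decrypt and audit captured traffic while eavesdroppers cannot. Everything here is illustrative, and the cipher is a toy XOR keystream built from SHA-256 purely for demonstration; a real client would use an authenticated cipher such as AES-GCM:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher (SHA-256 in counter mode), for illustration only.
    XORing twice with the same key and nonce recovers the plaintext."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

device_key = secrets.token_bytes(32)  # generated on-device, known to the owner
nonce = secrets.token_bytes(16)       # fresh per message

telemetry = b'{"event":"app_launch"}'              # logged locally in the clear
wire = keystream_xor(device_key, nonce, telemetry)  # what leaves the machine

# Network observers see only ciphertext, but the owner, holding the
# device key, can decrypt captured traffic and verify what was sent.
assert keystream_xor(device_key, nonce, wire) == telemetry
```

The design point is that transparency and transport privacy aren't in conflict: the key stays on the machine, so only the party being surveilled gains the ability to audit.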

I'm confident a company like Apple could figure out a way to both protect the user and be transparent if they actually cared to do so. This situation shows very clearly that they don't, and that all they care about is collecting data on people and have no concern at all about how they do it or the harm it causes those people. And that, along with the secrecy, is why these companies deserve to be knocked down a peg or three, and hopefully that's going to happen.

As for the developer issue, maybe I'm misinterpreting something, but my understanding is that this issue is with the latest macOS update (Big Sur), not with the units/computers themselves. So it would seem this is something that affects all users.
Posted by Henry Johannson
 - November 18, 2020, 15:58:38
That's disappointing. They really need to be held accountable for this as it's not right.
Posted by _MT_
 - November 18, 2020, 15:00:04
Quote from: vertigo on November 18, 2020, 05:00:00
As for the comments regarding common people not caring about this, a) true, but sad, and I'll never understand how people can just not care at all about this kind of thing
I guess it's because they can't imagine how it could do harm. And frankly, even people inventing and developing such technologies often can't imagine where it will lead and how it can be (mis)used.

If there is no way to opt out, that in itself should make it illegal. It is useful to know how people use your products. But all such campaigns should be strictly opt-in, and there should be clear rules: what you're collecting, what for, and how long you're going to keep it. If you're doing a probe into users' behaviour to inform future development, you don't need to keep the data for ten years. You can't collect for no reason, or just in case. Nor can you use the data for another purpose.

However, there is one question I have: are we talking about units in a developer program, or standard retail units? Since retail units were supposed to start shipping on the 17th, if I'm not mistaken, I guess we're talking developers here. And that can complicate the situation a bit (they have a contract with Apple, right?).

Encryption is a double-edged sword. Like everything, really (every coin has two sides). It was irresponsible of them not to encrypt the data. On the other hand, effective encryption would prevent us from figuring out what our device is sending.
Posted by vertigo
 - November 18, 2020, 05:00:00
This is bad on so many levels. In addition to the hypocrisy mentioned, and the fact that it's not just a privacy violation but also a security issue and an active interference in the use of people's personal computers/property, the timing couldn't be worse for them, with this coming out during the congressional hearing and the coming lawsuits. It's like they want to be fined and/or broken up. I hope they're held to account for this, both by the US government and the EU, since I'm sure this is a violation of the GDPR.

As for the comments regarding common people not caring about this: a) true, but sad, and I'll never understand how people can just not care at all about this kind of thing; and b) it's one thing for companies to do stuff like this and have it affect those people (since they don't care, I certainly don't care about them, only about the larger issue), because at least there's typically a way for those who do care to work around it, but Apple taking steps to remove even that possibility really takes it to the next level. Not surprising, though, since they've always been about making people bend to their whims, dictating what people can do with their products and how. I'd like to think they're painting themselves into a corner by making their products more niche (limitations of their new chip, e.g. no more Boot Camp) and by alienating more and more people with actions like this, but the sad truth is that, as others have mentioned, it won't matter. People will still keep refusing to see Apple for what it is.
Posted by Gigaboly
 - November 18, 2020, 04:35:56
T2 security chip and now M1: who the heck wants to buy a MacBook? They'll be slower than an Intel Pentium with this new phone CPU in them anyway...
Posted by kek
 - November 17, 2020, 21:34:56
Like the previous comment said, common folks don't care at all about stuff like this. There are a lot of people out there who would literally do dangerous stuff just to get an Apple device.
Posted by MxViking
 - November 17, 2020, 21:15:04
At the end of the day, most folks will not care either way.

Companies like Apple, Google, Facebook, etc. will continue to get away with as much as possible in regards to collecting, using and/or selling user data.

An expose here and there is just a small bump for them. They know that most folks will continue using and buying.