MacOS Big Sur is spying on everything you do and sending the data to Apple

Started by Redaktion, November 17, 2020, 18:33:51

Redaktion

Apple's latest operating system, macOS Big Sur, uses a new API to constantly send users' data (including how, when, and where a Mac is used) to Apple. This data is transmitted to Apple without encryption, meaning anyone with access to the same network as the Mac can see the information.

https://www.notebookcheck.net/MacOS-Big-Sur-is-spying-on-everything-you-do-and-sending-the-data-to-Apple.504381.0.html
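For context, the unencrypted traffic at issue here is widely understood to be the OCSP certificate-revocation checks that macOS sends over plain HTTP when applications are launched. As a minimal illustration of what an on-path observer could read, the sketch below parses the raw DER body of one such captured request (the file name is hypothetical) using the third-party Python cryptography package:

from cryptography.x509 import ocsp

# Raw DER body of a captured OCSP request (hypothetical file name); with
# plain HTTP, anyone on the same network can capture this payload.
with open("captured_request.der", "rb") as f:
    request = ocsp.load_der_ocsp_request(f.read())

# The request is neither encrypted nor authenticated, so an eavesdropper
# learns which signing certificate is being checked, and hence something
# about what software is being launched.
print("Issuer name hash:", request.issuer_name_hash.hex())
print("Issuer key hash: ", request.issuer_key_hash.hex())
print("Cert serial no.: ", hex(request.serial_number))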

Ncfan

It's not a new feature, actually; users experienced it when Big Sur was released while they were still running Catalina. So it's a clickbait article.

Ste

Well, Apple has already reacted to the partially mistaken report by Jeffrey Paul, saying that no hash of the apps you run is actually sent and that it will stop logging IP addresses.

In the meantime, another expert, Jacopo Jannone, shared a different analysis, more in line with Apple's explanations.

The article should be updated accordingly, in my view.

You can see the updated article on 9to5Mac.


Mate

#2 They won't track IPs anymore? Who cares? The point is that they were doing it behind our backs while using their supposed 'respect' for users' privacy in advertising. Hypocrites.

t4n0n

Quote from: Ste on November 17, 2020, 19:03:37
Well, Apple has already reacted to the partially mistaken report by Jeffrey Paul, saying that no hash of the apps you run is actually sent and that it will stop logging IP addresses.

In the meantime, another expert, Jacopo Jannone, shared a different analysis, more in line with Apple's explanations.

The article should be updated accordingly, in my view.

You can see the updated article on 9to5Mac.



The original author has addressed the points made by Jacopo, as you can see from his now-updated blog post.

Based on the reactions in the YCombinator comments, most people don't agree with Jannone's findings and echo the points made by Paul.

A

Not a surprise; even on mobile, Apple was the first one to put spyware into their phones.

JoeBlack

Bypassing VPNs and unencrypted data transfers.
That Win10 telemetry doesn't sound all that bad now!
:D

MxViking

At the end of the day, most folks will not care either way.

Companies like Apple, Google, Facebook, etc. will continue to get away with as much as possible when it comes to collecting, using, and/or selling user data.

An exposé here and there is just a small bump for them. They know that most folks will continue using and buying.

kek

Like the previous comment said, common folks don't care at all about stuff like this. There are a lot of people out there who would literally do dangerous stuff just to get an Apple device.

Gigaboly

T2 security chip and now M1; who the heck wants to buy a MacBook? They will be slower than an Intel Pentium with this new phone CPU in them anyway...

vertigo

This is bad on so many levels. In addition to the hypocrisy already mentioned, it's not just a privacy violation but also a security issue and an active interference in the use of people's personal computers/property. And the timing couldn't be worse for them, with this coming out during the congressional hearings and with lawsuits on the way. It's like they want to be fined and/or broken up. I hope they're held to account for this, both by the US government and the EU, since I'm sure this is a violation of the GDPR.

As for the comments regarding common people not caring about this: a) true, but sad, and I'll never understand how people can just not care at all about this kind of thing; and b) it's one thing for companies to do stuff like this and have it affect those people, and since they don't care, I certainly don't (about them, not about the larger issue). But at least there's typically a way for those who do care to work around this kind of thing, and Apple taking steps to remove even that possibility really takes it to the next level. Not surprising, though, since they've always been about making people bend to their whims, dictating what people can do with their products and how. I'd like to think they're painting themselves into a corner by making their products more niche (limitations of their new chip, e.g. no more Boot Camp) and by alienating more and more people with actions like this, but the sad truth is that, as others have mentioned, it won't matter. People will still keep refusing to see Apple for what it is.

_MT_

Quote from: vertigo on November 18, 2020, 05:00:00
As for the comments regarding common people not caring about this: a) true, but sad, and I'll never understand how people can just not care at all about this kind of thing
I guess it's because they can't imagine how it could do harm. And frankly, even the people inventing and developing such technologies often can't imagine where they will lead and how they can be (mis)used.

If there is no way to opt out, that in itself should make it illegal. It is useful to know how people use your products. But all such campaigns should be strictly opt-in, and there should be clear rules: what you're collecting, what for, and how long you're going to keep it. If you're probing users' behaviour to inform future development, you don't need to keep the data for ten years. You can't collect data for no reason, or just in case. Nor can you use the data for another purpose.

However, I have one question: are we talking about units in a developer program or standard retail units? Since retail units were supposed to start shipping on the 17th, if I'm not mistaken, I guess we're talking about developers here. And that can complicate the situation a bit (they have a contract with Apple, right?).

Encryption is a double-edged sword. Like everything, really (every coin has two sides). It was irresponsible of them not to encrypt the data. On the other hand, effective encryption would prevent us from figuring out what our devices are sending.

Henry Johannson

That's disappointing. They really need to be held accountable for this as it's not right.

vertigo

Quote from: _MT_ on November 18, 2020, 15:00:04
Quote from: vertigo on November 18, 2020, 05:00:00
As for the comments regarding common people not caring about this: a) true, but sad, and I'll never understand how people can just not care at all about this kind of thing
I guess it's because they can't imagine how it could do harm. And frankly, even the people inventing and developing such technologies often can't imagine where they will lead and how they can be (mis)used.

If there is no way to opt out, that in itself should make it illegal. It is useful to know how people use your products. But all such campaigns should be strictly opt-in, and there should be clear rules: what you're collecting, what for, and how long you're going to keep it. If you're probing users' behaviour to inform future development, you don't need to keep the data for ten years. You can't collect data for no reason, or just in case. Nor can you use the data for another purpose.

However, I have one question: are we talking about units in a developer program or standard retail units? Since retail units were supposed to start shipping on the 17th, if I'm not mistaken, I guess we're talking about developers here. And that can complicate the situation a bit (they have a contract with Apple, right?).

Encryption is a double-edged sword. Like everything, really (every coin has two sides). It was irresponsible of them not to encrypt the data. On the other hand, effective encryption would prevent us from figuring out what our devices are sending.

Unfortunately, that theory doesn't seem to hold much water, at least not all the time. Yes, some people just have no clue, but that's getting a lot harder with the constant news about this kind of thing; by now, you'd have to be very technically illiterate and/or oblivious not to know about it. More significantly, almost every single person I've explained the issue to, along with the possible repercussions, has ranged from couldn't-care-less (most commonly) to somewhat interested, but clearly not enough to actually do anything to protect themselves or discourage the behavior. As for the creators not realizing the potential impact of their creations: yes, that's sometimes true, and hindsight is 20/20, but in cases like this there's no way a company like Apple, or any company with even a modicum of tech knowledge, would be unaware of the potential for harm.

As for the encryption, if Apple were transparent about what they're doing, they could easily log the info locally so the user could see what's going on, then encrypt it and send it. Of course, that would require trusting that they're actually sending what they say they are, though that would be easier to believe if they were being transparent instead of getting caught red-handed. I'm sure they could also keep the encryption keys on each computer (obviously a different one for each machine), so the user could intercept the traffic and use the key to decrypt it to police them, whereas anyone else wouldn't have the key, and the traffic would still be protected.
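A minimal sketch of that idea, assuming a per-machine symmetric key shared only by the vendor and the machine's owner (all file names and fields below are hypothetical illustrations, and Python's third-party cryptography package stands in for whatever primitive would really be used):

import json
from cryptography.fernet import Fernet

MACHINE_KEY_PATH = "machine_telemetry.key"  # hypothetical per-machine key file
LOCAL_LOG_PATH = "telemetry_local.log"      # cleartext audit log for the owner

def load_or_create_key(path):
    # The same key would be known to the vendor and readable by the owner,
    # but to nobody in between on the network.
    try:
        with open(path, "rb") as f:
            return f.read()
    except FileNotFoundError:
        key = Fernet.generate_key()
        with open(path, "wb") as f:
            f.write(key)
        return key

def prepare_report(event):
    payload = json.dumps(event)
    # 1) Log locally in the clear so the owner can audit what is collected.
    with open(LOCAL_LOG_PATH, "a") as log:
        log.write(payload + "\n")
    # 2) Encrypt before transmission. The owner can decrypt intercepted
    #    traffic with the local key and compare it against the audit log;
    #    a network eavesdropper cannot read it.
    return Fernet(load_or_create_key(MACHINE_KEY_PATH)).encrypt(payload.encode())

ciphertext = prepare_report({"event": "app_launch", "ts": "2020-11-17T18:33:51"})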

I'm confident a company like Apple could figure out a way to both protect the user and be transparent if they actually cared to do so. This situation shows very clearly that they don't, and that all they care about is collecting data on people, with no concern at all for how they do it or the harm it causes those people. And that, along with the secrecy, is why these companies deserve to be knocked down a peg or three, and hopefully that's going to happen.

As for the developer issue, maybe I'm misinterpreting something, but my understanding is that this issue is with the latest macOS update (Big Sur), not with the units/computers themselves. So it would seem this is something that affects all users.

_MT_

I have to correct myself. I was under the impression that Big Sur was ARM-only. I wasn't expecting it to roll out to x86 as well.

If you have a service that checks whether a certificate has been revoked for some reason, signalling that an application might not be secure, that traffic can reveal what you've been running. That service has been around for a long time, I think since Sierra. They certainly shouldn't keep logs of who asked about what; there is no need for it (to render the service), and therefore they shouldn't (definitely not for long). But you're sending the information over, and you can only trust them not to keep it. I guess the only way around it would be to flip the information flow: publish a list of all revoked certificates so that your computer can check internally (a rough sketch follows below).

From a legal perspective, a problem here is that information identifying a certificate used to sign an application, or the application itself, won't be considered personal. It can only become personal in combination with other data. Consider that an IP address is not necessarily personal. An e-mail address is not necessarily personal either. But in the position Apple is in, they can probably link an IP address to a person or a household for a significant portion of the user base. Therefore, they should treat it as personal information. And that might be why they'll be deleting it.
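A minimal sketch of that flipped flow, assuming the vendor periodically publishes a flat list of revoked certificate serial numbers at a known URL (the URL and file format here are hypothetical):

import urllib.request

REVOCATION_LIST_URL = "https://example.com/revoked-serials.txt"  # hypothetical

def fetch_revoked_serials(url=REVOCATION_LIST_URL):
    # One bulk download on a schedule; the server learns nothing about
    # which individual certificates this machine later asks about.
    with urllib.request.urlopen(url) as response:
        lines = response.read().decode().splitlines()
    return {int(line, 16) for line in lines if line.strip()}

def certificate_is_revoked(serial, revoked):
    # Purely local lookup: no per-launch network request, nothing to log.
    return serial in revoked

revoked = fetch_revoked_serials()
print(certificate_is_revoked(0x1A2B3C, revoked))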

Privacy, and the public's take on it, are interesting topics. Consider how people treat pictures that include strangers' faces or cars' registration plates. Where I come from, it's generally illegal to publish such a picture without removing the offending parts or getting consent. But it seems like people couldn't care less. I guess it would be too much work for them to edit their photos before publishing them.
