Voluntarily sharing informative posts from unaffiliated sources.

  • 38 Posts
  • 0 Comments
Cake day: Jan 16, 2024

>Signal has announced new functionality in its upcoming beta releases, allowing users to transfer messages and media when linking their primary Signal device to a new desktop or iPad. This feature offers the choice to carry over chats and the last 45 days of media, or to start fresh with only new messages.
>
>The transfer process is end-to-end encrypted, ensuring privacy. It involves creating a compressed, encrypted archive of your Signal data, which is then sent to the new device via Signal's servers. Despite handling the transfer, the servers cannot access the message content due to the encryption.
>
>With the introduction of a cross-platform archive format, Signal is also exploring additional tools for message transfer to new devices or restoration in case of device loss or damage. Users can begin testing this feature soon, with a wider rollout expected in the coming weeks.

>Bitwarden users who store their email account credentials within their Bitwarden vaults would have trouble accessing the sent codes if they are unable to log in to their email.
>
>To prevent getting locked out of your vault, be sure you can access the email associated with your Bitwarden account so you can retrieve the emailed codes, or turn on any form of two-step login to avoid this process altogether.

>Last month, Ente launched https://theyseeyourphotos.com/, a website and marketing stunt designed to turn Google’s technology against itself. People can upload any photo to the website, which is then sent to a Google Cloud computer vision program that writes a startlingly thorough three-paragraph description of it. (Ente prompts the AI model to document small details in the uploaded images.)
>
>If you don’t want to upload your own picture, Ente gives people the option to experiment on Theyseeyourphotos using one of several stock images. Google’s computer vision is able to pick up on subtle details in them, like a person’s tattoo that appears to be of the letter G, or a child’s temporary tattoo of a leaf. “The whole point is that it is just a single photo,” Mohandas says. He hopes the website prompts people to imagine how much Google—or any AI company—can learn about them from analyzing thousands of their photos in the cloud in the same way.


>A bipartisan group of 12 senators has urged the Transportation Security Administration’s inspector general to investigate the agency’s use of facial recognition, saying it poses a significant threat to privacy and civil liberties.
>
>“This technology will soon be in use at hundreds of major and mid-size airports without an independent evaluation of the technology’s precision or an audit of whether there are sufficient safeguards in place to protect passenger privacy,” the senators wrote.
>
>“While the TSA claims facial recognition is optional, it is confusing and intimidating to opt out of TSA’s facial recognition scans, and our offices have received numerous anecdotal reports of Transportation Security Officers (TSOs) becoming belligerent when a traveler asks to opt out, or simply being unaware of that right,” the senators wrote. They added that in some airports the signage instructing flyers to step in front of a camera is prominently displayed, while signs advising passengers of their right to opt out of face scans are “strategically placed in inconspicuous locations.”
>
>To opt out of a face scan at an airport, a traveler need only say that they decline facial recognition. They can then proceed normally through security by presenting an identification document, such as a driver’s license or passport.

>New research reveals serious privacy flaws in the data practices of new internet connected cars in Australia. It’s yet another reason why we need urgent reform of privacy laws.
>
>Modern cars are increasingly equipped with internet-enabled features. Your “connected car” might automatically detect an accident and call emergency services, or send a notification if a child is left in the back seat.
>
>But connected cars are also sophisticated surveillance devices. The data they collect can create a highly revealing picture of each driver. If this data is misused, it can result in privacy and security threats.
>
>A report published today analysed the privacy terms from 15 of the most popular new car brands that sell connected cars in Australia.
>
>This analysis uncovered concerning practices. There are enormous obstacles for consumers who want to find and understand the privacy terms. Some brands also make inaccurate claims that certain information is not “personal information”, implying the Privacy Act doesn’t apply to that data.
>
>Some companies are also repurposing personal information for “marketing” or “research”, and sharing data with third parties.

>404 Media, along with Haaretz, Notus, and Krebs On Security recently reported on a company that captures smartphone location data from a variety of sources and collates that data into an easy-to-use tool to track devices’ (and, by proxy, individuals’) locations. The dangers that this tool presents are especially grave for those traveling to or from out-of-state reproductive health clinics, places of worship, and the border.
>
>The tool, called Locate X, is run by a company called Babel Street. Locate X is designed for law enforcement, but an investigator working with Atlas Privacy, a data removal service, was able to gain access to Locate X by simply asserting that they planned to work with law enforcement in the future.
>
>With an incoming administration adversarial to those most at risk from location tracking using tools like Locate X, the time is ripe to bolster our digital defenses. Now more than ever, attorneys general in states hostile to reproductive choice will be emboldened to use every tool at their disposal to incriminate those exerting their bodily autonomy. Locate X is a powerful tool they can use to do this. So here are some timely tips to help protect your location privacy.


>With the looming presidential election, a United States Supreme Court majority that is hostile to civil rights, and a conservative effort to roll back AI safeguards, strong state privacy laws have never been more important.
>
>But late last month, efforts to pass a federal comprehensive privacy law died in committee, leaving the future of privacy in the US unclear. Who that future serves largely rests on one crucial issue: the preemption of state law.
>
>On one side, the biggest names in technology are trying to use their might to force Congress to override crucial state-level privacy laws that have protected people for years.
>
>On the other side is the American Civil Liberties Union and 55 other organizations. We explained in our own letter to Congress how a federal bill that preempts state law would leave millions with fewer rights than they had before. It would also forbid state legislatures from passing stronger protections in the future, smothering progress for generations to come.
>
>Preemption has long been the tech industry’s holy grail. But few know its history. It turns out, Big Tech is pulling straight from the toxic strategy that Big Tobacco used in the 1990s. Back then, Big Tobacco invented the “Accommodation Program,” a national campaign ultimately aimed at federal preemption of indoor smoking laws.
>
>Philip Morris and others in the tobacco industry implemented a three-step strategy which is only known through documents made public in litigation years later. Those documents reveal the inner workings of a nefarious corporate influence machine designed to quietly snuff out a democratic movement that threatened their profits. And now Big Tech is trying to do the same.
>
>But it’s not too late. We can ensure our civil rights and civil liberties are protected in the digital age. But to defeat Big Tech’s strategy, first we must understand it.

>Another month, another attempt: even though Hungary had to cancel the latest EU Council vote on the Child Sexual Abuse (CSA) Regulation in June 2024 because there was no majority among member states, it tried again this Wednesday - without success. The tipping point was the Dutch intelligence service clearly stating its view of the enormous threat to everybody's security should end-to-end encryption be weakened. Encryption is paramount for digital resilience in Europe.

>Depending on where you're based, you'll find PayPal's new data-sharing option under a different name. Remember, you may not see this at all if you're based in a country that doesn't allow it.
>
>If you're in the US, you should head to your profile Settings and tap on Data & privacy. Under Manage shared info, click on Personalized shopping. You should see the option enabled by default. Toggle off the button at the right to opt out.
>
>If you are in the UK like me, you'll see something different after you head to your profile Settings and tap on Data & privacy.
>
>Under Manage your privacy settings, you'll see an Interest-based marketing tab – click on it. At this point, two options will appear: Interest-based marketing on PayPal and Internet-based marketing on your accounts. You have to tap on each of these and toggle off the button at the right to opt out. These instructions also apply if you're based in the EU.

>Tails will be incorporated “into the Tor Project’s structure,” which will allow for “easier collaboration, better sustainability, reduced overhead, and expanded training and outreach programs to counter a larger number of digital threats,” according to a [blog post](https://blog.torproject.org/tor-tails-join-forces/) published today by the Tor Project.

>LinkedIn users in the U.S. — but not the EU, EEA, or Switzerland, likely due to those regions’ data privacy rules — have an opt-out [toggle](https://www.linkedin.com/mypreferences/d/settings/data-for-ai-improvement) in their settings screen disclosing that LinkedIn scrapes personal data to train “content creation AI models.” The toggle isn’t new. But, as first [reported](https://www.404media.co/linkedin-is-training-ai-on-user-data-before-updating-its-terms-of-service/) by 404 Media, LinkedIn initially didn’t refresh its privacy policy to reflect the data use.
>
>The terms of service have now been [updated](https://www.linkedin.com/legal/privacy-policy), but ordinarily that occurs well before a big change like using user data for a new purpose like this. The idea is it gives users an option to make account changes or leave the platform if they don’t like the changes. Not this time, it seems.
>
>**To opt out of LinkedIn’s data scraping, head to the “Data Privacy” section of the LinkedIn settings menu on desktop, click “Data for Generative AI improvement,” then toggle off the “Use my data for training content creation AI models” option. You can also attempt to opt out more comprehensively [via this form](https://www.linkedin.com/help/linkedin/ask/TS-DPRO), but LinkedIn notes that any opt-out won’t affect training that’s already taken place.**
>
>The nonprofit Open Rights Group (ORG) has called on the Information Commissioner’s Office (ICO), the U.K.’s independent regulator for data protection rights, to investigate LinkedIn and other social networks that train on user data by default.
>
>“LinkedIn is the latest social media company found to be processing our data without asking for consent,” Mariano delli Santi, ORG’s legal and policy officer, said in a statement. “The opt-out model proves once again to be wholly inadequate to protect our rights: the public cannot be expected to monitor and chase every single online company that decides to use our data to train AI. Opt-in consent isn’t only legally mandated, but a common-sense requirement.”

Proton Wallet Review: Is Proton Losing Touch? - Privacy Guides
>Proton, the Swiss creators of privacy-focused products like Proton Mail and ProtonVPN, recently released the latest product in their ever-growing lineup: Proton Wallet. Announced at the end of July 2024, it promotes itself as "an easy-to-use, self-custodial" Bitcoin wallet that will ostensibly make financial freedom more attainable for everyone.
>
>It may well be that Proton Wallet is the easiest way to start using Bitcoin, but is a Bitcoin wallet the solution people need to improve their financial privacy?
>
>Contrary to popular belief, cryptocurrency is not an inherently private transactional system.
>
>Had Proton Wallet added support for Monero or a similarly private cryptocurrency, they could have single-handedly boosted a financial system that is actually private by default by a significant degree. In my eyes, failing to do so in favor of the market leader is an unfortunate step back from their "privacy by default" mantra.
>
>Proton Wallet seems like a product that doesn't know its own place in the world.
>
>Is it meant to save us from the tyranny of payment processors like PayPal who can freeze your funds at a whim?
>
>Or, was Bitcoin chosen to give us independence from fiat currency, including stablecoins, entirely?
>
>However, if Proton Wallet wasn't meant for all that, if it was simply meant to bring privacy to Bitcoin, then it's certainly a failure.
>
>Proton hasn't taken any risks with this product, meaning it's really only good for satisfying a singular belief: that Bitcoin is just inherently good, and anything to promote Bitcoin is inherently good as well. I don't share the fanatical beliefs of Bitcoin maximalists, especially when Bitcoin is demonstrably lacking in a wide variety of ways.
>
>Personally, I'm a bit of a cryptocurrency pessimist in general, but I can see some appeal for the technology in very specific areas. Unfortunately, Proton Wallet doesn't seem to fit into a useful niche in any meaningful way. The functionality it does support is extremely basic, even by Bitcoin standards, and it simply doesn't provide enough value over the existing marketplace.
>
>If you're an existing Proton user simply looking for a place to store some Bitcoin you already have sitting around, Proton Wallet might be perfectly adequate. For everyone else, I don't see this product being too useful. Bitcoin is still far too volatile to be a solid investment or used as a safe store of value if you crave financial independence and sovereignty, and Proton Wallet simply isn't adequate for paying for things privately online.


For Android users seeking a privacy-focused browser, [Privacy Guides](https://www.privacyguides.org/en/mobile-browsers/#mull) recommends Mull:

>Mull is a privacy oriented and deblobbed Android browser based on Firefox. Compared to Firefox, it offers much greater fingerprinting protection out of the box, and disables JavaScript Just-in-Time (JIT) compilation for enhanced security. It also removes all proprietary elements from Firefox, such as replacing Google Play Services references.
>
>Mull enables many features upstreamed by the Tor uplift project using preferences from Arkenfox. Proprietary blobs are removed from Mozilla's code using the scripts developed for Fennec F-Droid.


>Repeated media reports of Google’s disregard for the privacy of the general public led to a push for open source, community driven alternatives to Google Maps. The biggest contender, now used by Google’s direct competitors and open source projects alike, is OpenStreetMap.
>
>**1. OsmAnd**
>
>OsmAnd is a fantastic choice when searching for an alternative to Google Maps. It is available on both Android and iOS devices with both free and paid subscription options. Free accounts have full access to maps and navigation features, but choosing a paid subscription will allow you unlimited map downloads and increases the frequency of updates.
>
>All subscriptions can take advantage of turn-by-turn navigation, route planning, map markers, and all the favorite features you expect from a map and navigation app in 2024. By making the jump to a paid subscription you get some extra features like topo maps, nautical depths, and even point-of-interest data imported from Wikipedia.
>
>**2. Organic Maps**
>
>Organic Maps is a great choice primarily because they offer support for all features of their iOS and Android apps completely offline. This means if you have an old phone lying around, you can install the app, download the maps you need and presto! You now have an in-depth digital map in the palm of your hand without needing to worry about losing or damaging your primary mobile device when exploring the outdoors.
>
>Organic Maps tugs our heartstrings with its commitment to privacy. The app can run entirely without a network connection and comes with no ads, tracking, data collection, and best of all no registration.
>
>**3. Locus Maps**
>
>Our third, and last, recommendation today is Locus Maps. Locus Maps is built by outdoor enthusiasts for the same community. Hiking, biking, and geocaching are all mainstays of the Locus app, alongside standard street map navigation as well.
>
>Locus is available in its complete version for Android, and an early version for iOS is continuing to be worked on. Locus Maps offers navigation, tracking and routes, and also information on points-of-interest you might visit or stumble upon during your adventures.

Google ads push fake Google Authenticator site installing malware | The ad displays “google.com” and “https://www.google.com” as the click URL, and the advertiser’s identity is verified by Google
>Google has fallen victim to its own ad platform, allowing threat actors to create fake Google Authenticator ads that push the DeerStealer information-stealing malware.
>
>In a new malvertising campaign found by Malwarebytes, threat actors created ads that display an advertisement for Google Authenticator when users search for the software in Google search.
>
>What makes the ad more convincing is that it shows 'google.com' and "https://www.google.com" as the click URL, which clearly should not be allowed when a third party creates the advertisement.
>
>We have seen this very effective URL cloaking strategy in past malvertising campaigns, including for KeePass, Arc browser, YouTube, and Amazon. Still, Google continues to fail to detect when these imposter ads are created.
>
>Malwarebytes noted that the advertiser's identity is verified by Google, showing another weakness in the ad platform that threat actors abuse.
>
>When the download is executed, it will launch the DeerStealer information-stealing malware, which steals credentials, cookies, and other information stored in your web browser.
>
>Users looking to download software are recommended to avoid clicking on promoted results on Google Search, use an ad blocker, or bookmark the URLs of software projects they typically use.
>
>Before downloading a file, ensure that the URL you're on corresponds to the project's official domain. Also, always scan downloaded files with an up-to-date AV tool before executing.
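The advice above, confirming that a download URL actually belongs to the project's official domain, can be automated with a simple allow-list check. This is a minimal sketch (the domains listed are hypothetical examples, not an endorsed list); the key point is to parse the URL's real hostname rather than trust the text an ad displays:

```python
from urllib.parse import urlparse

# Hypothetical allow-list of domains you trust for downloads.
OFFICIAL_DOMAINS = {"github.com", "keepass.info"}

def is_official_download(url: str, allowed=OFFICIAL_DOMAINS) -> bool:
    """Return True only when the URL's actual hostname is an allowed
    domain or a subdomain of one. An ad's display text can say
    'google.com' while the link resolves somewhere else entirely,
    so never rely on the label."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in allowed)
```

Note this only guards against look-alike and cloaked URLs; it cannot vouch for a compromised official site, so scanning the downloaded file still matters.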

>Filed in 2022, the Texas lawsuit said that Meta was in violation of a state law that prohibits capturing or selling a resident’s biometric information, such as their face or fingerprint, without their consent.
>
>The company announced in 2021 that it was shutting down its face-recognition system and deleting the faceprints of more than 1 billion people amid growing concerns about the technology and its misuse by governments, police and others.
>
>Texas filed a similar lawsuit against Google in 2022. Paxton’s lawsuit says the search giant collected millions of biometric identifiers, including voiceprints and records of face geometry, through its products and services like Google Photos, Google Assistant, and Nest Hub Max. That lawsuit is still pending.
>
>The $1.4 billion is unlikely to make a dent in Meta’s business. The Menlo Park, California-based tech giant made a profit of $12.37 billion in the first three months of this year. Its revenue was $36.46 billion, an increase of 27% from a year earlier.

>The Kids Online Safety Act (KOSA) easily passed the Senate today despite critics' concerns that the bill may risk creating more harm than good for kids and perhaps censor speech for online users of all ages if it's signed into law.
>
>KOSA received broad bipartisan support in the Senate, passing with a 91–3 vote alongside the Children’s Online Privacy Protection Act (COPPA) 2.0. Both bills seek to control how much data can be collected from minors, as well as regulate the platform features that could harm children's mental health.
>
>However, while child safety advocates have heavily pressured lawmakers to pass KOSA, critics, including hundreds of kids, have continued to argue that it should be blocked.
>
>Among them is the American Civil Liberties Union (ACLU), which argues that "the House of Representatives must vote no on this dangerous legislation."
>
>If not, potential risks to kids include threats to privacy (by restricting access to encryption, for example), reduced access to vital resources, and reduced access to speech that impacts everyone online, the ACLU has alleged.
>
>The ACLU recently staged a protest of more than 300 students on Capitol Hill to oppose KOSA's passage. Attending the protest was 17-year-old Anjali Verma, who criticized lawmakers for ignoring kids who are genuinely concerned that the law would greatly limit their access to resources online.
>
>"We live on the Internet, and we are afraid that important information we’ve accessed all our lives will no longer be available," Verma said. "We need lawmakers to listen to young people when making decisions that affect us."

>In a new academic paper, researchers from the Belgian university KU Leuven detailed their findings when they analyzed 15 popular dating apps. Of those, Badoo, Bumble, Grindr, happn, Hinge and Hily all had the same vulnerability that could have helped a malicious user to identify the near-exact location of another user, according to the researchers.
>
>While none of those apps share exact locations when displaying the distance between users on their profiles, they did use exact locations for the “filters” feature of the apps. Generally speaking, by using filters, users can tailor their search for a partner based on criteria like age, height, what type of relationship they are looking for and, crucially, distance.
>
>To pinpoint the exact location of a target user, the researchers used a novel technique they call “oracle trilateration.”
>
>The good news is that all the apps that had these issues, and that the researchers reached out to, have now changed how distance filters work and are not vulnerable to the oracle trilateration technique.
>
>Neither Badoo, which is owned by Bumble, nor Hinge responded to a request for comment.
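The core idea behind oracle trilateration can be illustrated with a small sketch: treat the app's distance filter as a yes/no oracle ("is the target within r of me?"), binary-search the filter radius from three different probe positions to recover three near-exact distances, then intersect the resulting circles. This is an illustrative flat-plane approximation written for this post, not the researchers' actual code:

```python
import math

def exact_distance(oracle, probe, hi=50.0, eps=1e-6):
    """Recover the probe-to-target distance by binary-searching the
    filter radius: oracle(probe, r) answers whether the hidden target
    is within r of probe, just like a dating app's distance filter."""
    lo = 0.0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if oracle(probe, mid):
            hi = mid   # target inside radius: shrink
        else:
            lo = mid   # target outside: grow
    return (lo + hi) / 2

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Intersect three circles (planar approximation) to recover the
    target's coordinates from three probe points and distances."""
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    # Subtracting the circle equations pairwise yields two linear equations.
    A, B = 2 * (bx - ax), 2 * (by - ay)
    C = r1**2 - r2**2 - ax**2 + bx**2 - ay**2 + by**2
    D, E = 2 * (cx - bx), 2 * (cy - by)
    F = r2**2 - r3**2 - bx**2 + cx**2 - by**2 + cy**2
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return x, y

# Simulated target and oracle, standing in for a vulnerable filter:
target = (3.0, 4.0)
oracle = lambda p, r: math.dist(p, target) <= r
probes = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
d = [exact_distance(oracle, p) for p in probes]
print(trilaterate(probes[0], d[0], probes[1], d[1], probes[2], d[2]))  # ≈ (3.0, 4.0)
```

The fix the apps adopted amounts to degrading the oracle, for example by rounding or snapping the filter distance to a grid so the binary search cannot converge on an exact radius.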

>A federal district court in New York has ruled that U.S. border agents must obtain a warrant before searching the electronic devices of Americans and international travelers crossing the U.S. border.
>
>The ruling on July 24 is the latest court opinion to upend the U.S. government’s long-standing legal argument, which asserts that federal border agents should be allowed to access the devices of travelers at ports of entry, like airports, seaports and land borders, without a court-approved warrant.
>
>“The ruling makes clear that border agents need a warrant before they can access what the Supreme Court has called ‘a window into a person’s life,’” Scott Wilkens, senior counsel at the Knight First Amendment Institute, one of the groups that filed in the case, said in a press release Friday.
>
>The district court’s ruling takes effect across the U.S. Eastern District of New York, which includes New York City-area airports like John F. Kennedy International Airport, one of the largest transportation hubs in the United States.
>
>Critics have for years argued that these searches are unconstitutional and violate the Fourth Amendment, which protects against unwarranted searches and seizures of a person’s electronic devices.
>
>In this court ruling, the judge relied in part on an amicus brief filed on the defendant’s behalf that argued the unwarranted border searches also violate the First Amendment on grounds of presenting an “unduly high” risk of a chilling effect on press activities and journalists crossing the border.
>
>With several federal courts ruling on border searches in recent years, the issue of their legality is likely to end up before the Supreme Court, unless lawmakers act sooner.



>The EU Council has now passed a 4th term without passing its controversial message-scanning proposal. The just-concluded Belgian Presidency failed to broker a deal that would push forward this regulation, which has now been debated in the EU for more than two years.
>
>For all those who have reached out to sign the “Don’t Scan Me” petition, thank you—your voice is being heard. News reports indicate the sponsors of this flawed proposal [withdrew it because they couldn’t get a majority](https://netzpolitik.org/2024/victory-for-now-no-majority-on-chat-control-for-belgium/) of member states to support it.
>
>Now, it’s time to stop attempting to compromise encryption in the name of public safety. EFF has [opposed](https://www.eff.org/deeplinks/2022/06/eus-new-message-scanning-regulation-must-be-stopped) this legislation from the start. Today, [we’ve published a statement](https://edri.org/wp-content/uploads/2024/07/Statement_-The-future-of-the-CSA-Regulation.pdf), along with EU civil society groups, explaining why this flawed proposal should be withdrawn.
>
>The scanning proposal would create “detection orders” that allow for messages, files, and photos from hundreds of millions of users around the world to be compared to government databases of child abuse images. At some points during the debate, EU officials even suggested using AI to scan text conversations and predict who would engage in child abuse. That’s one of the reasons why some opponents have labeled the proposal “chat control.”
>
>There’s [scant public support](https://edri.org/our-work/press-release-poll-youth-in-13-eu-countries-refuse-surveillance-of-online-communication/) for government file-scanning systems that break encryption. Nor is there [support in EU law](https://www.eff.org/deeplinks/2024/03/european-court-human-rights-confirms-undermining-encryption-violates-fundamental). People who need secure communications the most—lawyers, journalists, human rights workers, political dissidents, and oppressed minorities—will be the most affected by such invasive systems. Another group harmed would be those whom the EU’s proposal claims to be helping—abused and at-risk children, who need to securely communicate with trusted adults in order to seek help.
>
>The right to have a private conversation, online or offline, is a bedrock human rights principle. When surveillance is used as an investigation technique, it must be targeted and coupled with strong judicial oversight. In the coming EU council presidency, which will be led by Hungary, leaders should drop this flawed message-scanning proposal and focus on law enforcement strategies that respect peoples’ privacy and security.
>
>Further reading:
>- [EFF and EDRi Coalition Statement on the Future of the CSA Regulation](https://edri.org/wp-content/uploads/2024/07/Statement_-The-future-of-the-CSA-Regulation.pdf)

>We’ve said it before: online age verification is incompatible with privacy. Companies responsible for storing or processing sensitive documents like drivers’ licenses are likely to encounter data breaches, potentially exposing not only personal data like users’ government-issued ID, but also information about the sites that they visit.
>
>This threat is not hypothetical. This morning, 404 Media reported that a major identity verification company, AU10TIX, left login credentials exposed online for more than a year, allowing access to this very sensitive user data.
>
>A researcher gained access to the company’s logging platform, “which in turn contained links to data related to specific people who had uploaded their identity documents,” including “the person’s name, date of birth, nationality, identification number, and the type of document uploaded such as a drivers’ license,” as well as images of those identity documents. Platforms reportedly using AU10TIX for identity verification include TikTok and X, formerly Twitter.
>
>Lawmakers pushing forward with dangerous age verification laws should stop and consider this report. Proposals like the federal Kids Online Safety Act and California’s Assembly Bill 3080 are moving further toward passage, with lawmakers in the House scheduled to vote in a key committee on KOSA this week, and California's Senate Judiciary committee set to discuss AB 3080 next week. Several other laws requiring age verification for accessing “adult” content and social media content have already passed in states across the country. EFF and others are challenging some of these laws in court.
>
>In the final analysis, age verification systems are surveillance systems. Mandating them forces websites to require visitors to submit information such as government-issued identification to companies like AU10TIX. Hacks and data breaches of this sensitive information are not a hypothetical concern; it is simply a matter of when the data will be exposed, as this breach shows.
>
>Data breaches can lead to any number of dangers for users: phishing, blackmail, or identity theft, in addition to the loss of anonymity and privacy. Requiring users to upload government documents—some of the most sensitive user data—will hurt all users.
>
>According to the news report, so far the exposure of user data in the AU10TIX case did not lead to exposure beyond what the researcher showed was possible. If age verification requirements are passed into law, users will likely find themselves forced to share their private information across networks of third-party companies if they want to continue accessing and sharing online content. Within a year, it wouldn’t be strange to have uploaded your ID to a half-dozen different platforms.
>
>No matter how vigilant you are, you cannot control what other companies do with your data. If age verification requirements become law, you’ll have to be lucky every time you are forced to share your private information. Hackers will just have to be lucky once.




>iOS apps that build their own social networks on the back of users’ address books may soon become a thing of the past. In iOS 18, Apple is cracking down on the social apps that ask users’ permission to access their contacts — something social apps often do to connect users with their friends or make suggestions for who to follow. Now, Apple is adding a new two-step permissions pop-up screen that will first ask users to allow or deny access to their contacts, as before, and then, if the user allows access, will let them choose which contacts they want to share, if not all.
>
>For those interested in security and privacy, the addition is welcome. As security firm Mysk wrote on X, the change would be “sad news for data harvesting apps…” Others pointed out that this would hopefully prevent apps that ask repeatedly for address book access even after they had been denied. Now users could grant them access but limit which contacts they could actually ingest.


>“If you’re someone who’s buying products on the web, we know who is buying the products where, and we can leverage the data,” Grether said in a statement to the WSJ. He also said that PayPal will receive shopping data from customers using its credit card in stores.
>
>A PayPal spokesperson tells the WSJ that the company will collect data from customers by default while also offering the ability to opt out.
>
>PayPal is far from the only company to sell ads based on transaction information. In January, a study from Consumer Reports revealed that Facebook gets information about users from thousands of different companies, including retailers like Walmart and Amazon. JPMorgan Chase also announced that it’s creating an ad network based on customer spending data, while Visa is making similar moves. Of course, this doesn’t include the tracking shopping apps do to log your offline purchases, too.
fedilink

>Google’s AI model will potentially listen in on all your phone calls — or at least ones it suspects are coming from a fraudster. > >To protect the user’s privacy, the company says Gemini Nano operates locally, without connecting to the internet. “This protection all happens on-device, so your conversation stays private to you. We’ll share more about this opt-in feature later this year,” the company says. >“This is incredibly dangerous,” says Meredith Whittaker, president of the foundation behind the end-to-end encrypted messaging app Signal. > >Whittaker, a former Google employee, argues that the entire premise of the anti-scam call feature poses a potential threat. That’s because Google could potentially program the same technology to scan for other keywords, like asking for access to abortion services. > >“It lays the path for centralized, device-level client-side scanning,” she said in a post on Twitter/X. “From detecting 'scams' it's a short step to ‘detecting patterns commonly associated w/ seeking reproductive care’ or ‘commonly associated w/ providing LGBTQ resources' or ‘commonly associated with tech worker whistleblowing.’”
fedilink

>With the latest version of Firefox for U.S. desktop users, we’re introducing a new way to measure search activity broken down into high level categories. This measure is not linked with specific individuals and is further anonymized using a technology called OHTTP to ensure it can’t be connected with user IP addresses. > >Let’s say you’re using Firefox to plan a trip to Spain and search for “Barcelona hotels.” Firefox infers that the search results fall under the category of “travel,” and it increments a counter to calculate the total number of searches happening at the country level. > >Here’s the current list of categories we’re using: animals, arts, autos, business, career, education, fashion, finance, food, government, health, hobbies, home, inconclusive, news, real estate, society, sports, tech and travel. > >Having an understanding of what types of searches happen most frequently will give us a better understanding of what’s important to our users, without giving us additional insight into individual browsing preferences. This helps us take a step forward in providing a browsing experience that is more tailored to your needs, without us stepping away from the principles that make us who we are. > >We understand that any new data collection might spark some questions. Simply put, this new method only categorizes the websites that show up in your searches — not the specifics of what you’re personally looking up. > >Sensitive topics, like searching for particular health care services, are categorized only under broad terms like health or society. Your search activities are handled with the same level of confidentiality as all other data regardless of any local laws surrounding certain health services. > >Remember, you can always opt out of sending any technical or usage data to Firefox. Here’s a step-by-step guide on how to adjust your settings. We also don’t collect category data when you use Private Browsing mode on Firefox. 
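The post describes the mechanism only at a high level. As a toy sketch of client-side categorize-and-count, with an entirely hypothetical keyword map (Mozilla's real classifier and transport are not detailed here), the point is that only aggregate category counts would ever leave the device, never the queries themselves:

```python
from collections import Counter

# Hypothetical keyword map, purely for illustration; Firefox's actual
# classifier and full category list are not published in this post.
CATEGORY_KEYWORDS = {
    "travel": {"hotel", "hotels", "flight", "barcelona"},
    "health": {"clinic", "doctor", "symptoms"},
    "tech": {"linux", "browser", "gpu"},
}

def categorize(query: str) -> str:
    """Map a search query to a broad category; unknown queries stay 'inconclusive'."""
    words = set(query.lower().split())
    for category, keywords in CATEGORY_KEYWORDS.items():
        if words & keywords:
            return category
    return "inconclusive"

# Only these per-category tallies would be reported (via OHTTP), not the queries.
counts = Counter(categorize(q) for q in ["Barcelona hotels", "gpu benchmarks", "cake recipe"])
print(counts)
```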
>The Copy Without Site Tracking option can now remove parameters from nested URLs. It also includes expanded support for blocking over 300 tracking parameters from copied links, including those from major shopping websites. Keep those trackers away when sharing links!
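Firefox's actual parameter list (300+ entries) and implementation aren't given in the release note; a minimal sketch of the idea, using a small hypothetical subset of tracking parameters and recursing into URL-valued query parameters to cover the nested case:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Small illustrative subset; the real list covers 300+ parameters.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid", "mc_eid"}

def strip_tracking(url: str) -> str:
    """Drop known tracking parameters, cleaning nested URLs in query values too."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = []
    for key, value in parse_qsl(query, keep_blank_values=True):
        if key in TRACKING_PARAMS:
            continue
        # A URL-valued parameter (e.g. a redirect target) gets cleaned recursively.
        if value.startswith(("http://", "https://")):
            value = strip_tracking(value)
        kept.append((key, value))
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))
```

For example, `strip_tracking("https://a.com/p?gclid=123&id=9")` yields `https://a.com/p?id=9`, and a `utm_source` buried inside an encoded redirect URL is removed as well.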
fedilink

- Mullvad VPN's blog [post](https://mullvad.net/en/blog/dns-traffic-can-leak-outside-the-vpn-tunnel-on-android): DNS traffic can leak outside the VPN tunnel on Android >Identified scenarios where the Android OS can leak DNS traffic: >- If a VPN is active without any DNS server configured. >- For a short period of time while a VPN app is re-configuring the tunnel or is being force stopped/crashes. > >The leaks seem to be limited to direct calls to the C function getaddrinfo. > >The above applies regardless of whether Always-on VPN and Block connections without VPN is enabled or not, which is not expected OS behavior and should therefore be fixed upstream in the OS. > >We’ve been able to confirm that these leaks occur in multiple versions of Android, including the latest version (Android 14). > >We have reported [the issues and suggested improvements](https://issuetracker.google.com/issues/337961996) to Google and hope that they will address this quickly. - GrapheneOS 2024050900 release changelog [announcement](https://grapheneos.org/releases#2024050900): >prevent app-based VPN implementations from leaking DNS requests when the VPN is down/connecting (this is a preliminary defense against this issue and more research is required, along with apps preventing the leaks on their end or they'll still have leaks outside of GrapheneOS)
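Mullvad's finding centers on direct calls to the C function `getaddrinfo`. Python's `socket.getaddrinfo` is a thin wrapper over that same libc routine, so a short sketch of the call path in question (resolving `localhost` so the example works offline):

```python
import socket

# socket.getaddrinfo wraps the C getaddrinfo that Mullvad's report identifies:
# the lookup goes through the OS resolver rather than an app-level DNS proxy,
# which is why it can leak while a VPN tunnel is unconfigured or reconnecting.
def resolve(hostname: str, port: int = 443) -> list[str]:
    results = socket.getaddrinfo(hostname, port, proto=socket.IPPROTO_TCP)
    # Each entry is (family, type, proto, canonname, sockaddr); sockaddr[0] is the IP.
    return sorted({entry[4][0] for entry in results})

print(resolve("localhost"))
```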
fedilink

>Bitwarden Authenticator is a standalone app that is available for everyone, even non-Bitwarden customers. >In its current release, Bitwarden Authenticator generates time-based one-time passwords (TOTP) for users who want to add an extra layer of 2FA security to their logins. >There is a comprehensive roadmap planned with additional functionality. >Available for [iOS and Android](https://bitwarden.com/download/#bitwarden-authenticator-mobile)
fedilink

>The EU's Data Protection Board (EDPB) has told large online platforms they should not offer users a binary choice between paying for a service and consenting to their personal data being used for targeted advertising. >In October last year, the social media giant said users in the EU, EEA, or Switzerland could pay Meta to remove personalized ads from their Instagram or Facebook feeds and to stop it from using their personal data for marketing. Meta then announced a subscription model of €9.99/month on the web or €12.99/month on iOS and Android for users who did not want their personal data used for targeted advertising. >At the time, Felix Mikolasch, data protection lawyer at noyb, said: "EU law requires that consent is the genuine free will of the user. Contrary to this law, Meta charges a 'privacy fee' of up to €250 per year if anyone dares to exercise their fundamental right to data protection."
fedilink