AXEL.org

February 26, 2021

Should Privacy be a Human Right?

With the advancements in the mass surveillance technology used by governments and corporations, maintaining individual privacy has never been more important. AXEL believes privacy is a fundamental human right that these powerful institutions need to acknowledge. Without a vigorous defense of this position, influential organizations will inevitably erode privacy protections and lead society down a dark, Orwellian path.

Privacy law – not a new thing

Citizens demanding basic privacy is not a new phenomenon. Formal privacy law goes all the way back to 1361 in England[1]. Never mind modern accouterments like cellphones; back then, niceties such as plumbing and an easily traversable road system weren’t fathomable. It was the time of King Edward III, with England and France engaged in what would become known as the Hundred Years’ War. In other words, a LONG time ago.

The Justices of the Peace Act outlawed peeping toms and eavesdroppers under the penalty of imprisonment. It was a way to stop the town weirdo from spying on neighbors from behind a cow or haycart.

Today these concerns seem quaint, as every computer, cellphone, smartwatch, digital assistant, or any other piece of internet-connected technology is the equivalent of an eavesdropping creep. On the plus side, medicine advanced past the practice of bloodletting as a cure-all. So, we’ve got that going for us.

A decree from the United Nations

Fast-forward over half a millennium to 1948. The newly formed United Nations released the Universal Declaration of Human Rights[2]. This short document outlined fundamental rights for all people. Article 12 states, “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.”

While these UN guidelines are clear and concise, they lack any true enforcement mechanism. Fantastic ideals in theory; often ignored in practice.

United States privacy law history

Unfortunately, the United States Constitution doesn’t explicitly guarantee privacy as a right. However, not all is lost. Over the years, legal arguments have held that other liberties imply privacy rights. Examples include:

  • Stanford Law Review, April 2010. A piece in the prestigious legal journal by Orin Kerr outlined an argument for applying the Fourth Amendment to internet privacy[3]. The focus is on police-related intrusions, specifically warrant requirements for digital surveillance.
  • Griswold v. Connecticut. This 1965 case set the precedent that the Constitution grants privacy rights against government intrusion implicitly from other liberties established in the Bill of Rights[4]. While the case pertained to marital relations, the ruling set a precedent for the more general concept of implicit rights.

The current state of privacy

Two-thirds of countries have privacy regulations on the books[5]. So, everything’s all good, right? Time for privacy advocates to pack it up and celebrate their victory! No, things are not all rainbows and sunshine in this space. In fact, the situation is pretty bad.

Government privacy intrusions

The U.S. government spying on its citizens is nothing new. The practice dates back at least 70 years. Over this time, many groups (political activists, civil rights leaders, union participants, the far-Left, the far-Right, you name it) became surveillance targets of federal agencies like the FBI, CIA, and NSA. However, the devastating 9/11 attacks combined with advancing digital technology created a perfect storm for privacy intrusion at a scale never before seen.

Whistleblower Edward Snowden outlined the details in 2013[6]. Here are a few significant revelations from the leaks:

  • The NSA collected millions of people’s cellphone metadata (i.e., when calls are made and to whom) and location information[7]. A federal appeals court finally ruled this tactic illegal in 2020[8].
  • The NSA can easily break internet standard encryption methods to view private emails, financial transactions, and other personal data[9].
  • The NSA implemented a program code-named PRISM where the Big Tech companies would mine user data and turn it over to the agency upon request[10].

These only scratch the surface of the Snowden leaks. The story received enormous press coverage over the years, putting pressure on the federal agencies for more transparency. It is naive to think organizations like the NSA stopped using these tactics, though. After all, the courts didn’t ban illegal phone metadata collection until seven years after initial disclosure, after multiple other scandals[11].

Corporate intrusions

Of course, the government doesn’t have a monopoly on invading people’s privacy. Corporations are big players in the game, too (although, as seen in the PRISM program, the two entities can work together).

Big Tech has a notorious reputation in this regard. Companies such as Facebook, Google, and Amazon collect so much personal data that their algorithms probably know people better than they know themselves.

The best-known scandal involved Cambridge Analytica, a Big Data firm that bought user data from Facebook and used it to serve targeted political ads, allegedly helping shift the 2016 election toward Donald Trump[12].

Regardless of that hypothesis’s validity, data mining and selling are an everyday occurrence in Big Tech’s world. All one has to do is read the privacy policies or terms of service agreements the companies provide to get a glimpse at the breadth of knowledge they have about individuals. Easier said than done since those policies are thousands of words of legalese, but decipher them, and it becomes quite creepy.

Tougher legislation

Data privacy and protection are now mainstream topics. As such, some governments are enacting stronger legislation. The Gold Standard of these laws is the General Data Protection Regulation (GDPR) in the European Union. It is the most comprehensive data privacy law to date.

California took the main framework of the GDPR and passed a similar law called the California Privacy Rights Act (CPRA), which will take a few years to implement fully. While these are the best laws currently in effect, they still have loopholes that will undoubtedly lead to exploitation. Do they go far enough to protect everyone’s personal information? Only time will tell.

Be proactive

The GDPR and CPRA are much needed, but people should take matters into their own hands as well. Stop relying on “free” software from the megacorporations and search for privacy-based alternatives.

AXEL Go is the perfect solution for anyone looking for a private, secure file-sharing and storage platform. It has blockchain implementation, runs on the un-censorable InterPlanetary File System, and utilizes military-spec AES 256-bit encryption to ensure your files aren’t compromised. Sign up for a free Basic account and receive 2GB of online storage and enough network fuel for hundreds of typical shares. AXEL truly believes privacy is an inalienable human right. That’s why AXEL Go has industry-leading privacy features that will only get better. Download it today.


[1] English Parliament, “Justices of the Peace Act 1361”, legislation.gov.uk, https://www.legislation.gov.uk/aep/Edw3/34/1

[2] The United Nations, “The Universal Declaration of Human Rights”, un.org, 1948, https://www.un.org/en/universal-declaration-human-rights/#:~:text=Article%2012.,against%20such%20interference%20or%20attacks

[3] Kerr, Orin S. “Applying the Fourth Amendment to the Internet: A General Approach.” Stanford Law Review 62, no. 4 (2010): 1005-049. Accessed February 24, 2021. http://www.jstor.org/stable/40649623

[4] “Griswold v. Connecticut.” Oyez. Accessed February 24, 2021. https://www.oyez.org/cases/1964/496

[5] “Data Protection and Privacy Legislation Worldwide”, UNCTAD, Feb. 4, 2020, https://unctad.org/page/data-protection-and-privacy-legislation-worldwide

[6] Glenn Greenwald, “Edward Snowden: the whistleblower behind the NSA surveillance revelations”, The Guardian, June 9, 2013, https://www.theguardian.com/world/2013/jun/09/edward-snowden-nsa-whistleblower-surveillance

[7] Barton Gellman, Ashkan Soltani, “NSA tracking cellphone locations worldwide, Snowden documents show”, The Washington Post, Dec. 4, 2013, https://www.washingtonpost.com/world/national-security/nsa-tracking-cellphone-locations-worldwide-snowden-documents-show/2013/12/04/5492873a-5cf2-11e3-bc56-c6ca94801fac_story.html

[8] Josh Gerstein, “Court rules NSA phone snooping illegal -after 7-year delay”, Politico, Sept. 2, 2020, https://www.politico.com/news/2020/09/02/court-rules-nsa-phone-snooping-illegal-407727

[9] Joseph Menn, “New Snowden documents say NSA can break common Internet encryption”, Reuters, Sept. 5, 2013, https://www.reuters.com/article/net-us-usa-security-snowden-encryption/new-snowden-documents-say-nsa-can-break-common-internet-encryption-idUSBRE98413720130905

[10] Barton Gellman, Laura Poitras, “U.S., British intelligence mining data from nine U.S. Internet companies in broad secret program”, The Washington Post, June 7, 2013, https://www.washingtonpost.com/investigations/us-intelligence-mining-data-from-nine-us-internet-companies-in-broad-secret-program/2013/06/06/3a0c0da8-cebf-11e2-8845-d970ccb04497_story.html

[11] Zack Whittaker, “NSA improperly collected Americans’ phone records for a second time, documents reveal”, TechCrunch, June 26, 2019, https://techcrunch.com/2019/06/26/nsa-improper-phone-records-collection/

[12] Dan Patterson, “Facebook data privacy scandal: A cheat sheet”, TechRepublic, July 30, 2020, https://www.techrepublic.com/article/facebook-data-privacy-scandal-a-cheat-sheet/


Filed Under: Front Page Blogs, Privacy Tagged With: big tech, data mining, data privacy, human rights, Privacy

February 19, 2021

Why the Data Localization Movement is Misguided

Data localization, or data residency, is the concept of storing certain data collected on a nation’s citizens within the country of origin at all times. It gained steam after whistleblower Edward Snowden revealed the scope of government mass surveillance in 2013[1]. Governments worldwide enacted data localization legislation to protect state secrets and their citizens’ personal information from the watchful eyes of perceived competitors.

Governments expected and hoped these regulations would bring a host of benefits, including domestic IT job growth, more-hardened national cybersecurity, and increased data privacy. The truth is a bit murky, however, as the desired advantages haven’t materialized.

Countries and regions with data localization laws

First, let’s look into some examples of countries with data residency laws on the books. It is not a comprehensive list but illustrates how many nations are concerned about their data security.

The European Union

The EU’s sweeping data privacy law, the GDPR, sets many expectations for handling sensitive information, such as:

  • Profile data
  • Employment data
  • Financial data
  • Medical and health information
  • Payment data

The GDPR specifies that the above data types stay secured within the EU.  If any transfers are required out of the European Union, the countries receiving the information must have similar privacy regulations.

China

Unsurprisingly, China wants to keep a tight grip on its data. Basically, domestic network operators must store all data within China. They can transfer info across borders, but anything deemed “important” by the government must undergo a security clearance beforehand. What the CCP considers important is fairly broad. It includes:

  • Anything related to national security
  • Information that could identify Chinese citizens

As the country embraces Big Data collection on its citizens[2], you can expect the CCP to strengthen these laws.

Russia

The Russian Federation requires any personal identifying information about its citizens to be stored locally. This could mean:

  • Profile data
  • Financial information
  • Medical and health records

Interestingly, as long as companies initially stored the data in a Russian database, they can send it out of the country for further processing.

These regulations don’t apply only to domestic organizations. Anyone doing business in the country is subject to the law, so multinational corporations operating there must maintain Russia-specific data centers.

These three regions alone account for over a quarter of the world’s population, and many more countries have data localization laws. So, it’s pretty widespread. But what’s the United States’ opinion on the matter?

The United States viewpoint

The United States’ general belief is that data residency laws unduly stifle commerce and don’t offer the expected benefits. Analysts estimate half of the services trade depends on cross-border data flows[3]. With the United States being a service-dominant economy, it makes sense the government would oppose such regulation.

And oppose it, they have! In fact, it has been a point of contention in nearly all of the country’s recent trade negotiations, though the EU and Korea have pushed back on outright bans. The USMCA, the North American trade agreement replacing NAFTA, formally prohibits requiring data localization as a condition of doing business[4]. There are similar provisions in the U.S.-Japan Digital Trade Agreement[5] and the U.S.-Kenya Trade Agreement of 2020[6].

So, what are the downsides of data localization that countries like the United States want to avoid?

Technical issues

There is a multitude of technical headaches accompanying data localization. For instance, what if tech personnel in other countries regularly access the data for debugging or maintenance? Or what if a company uses foreign backup databases for redundancy?

It’s challenging to build separate data centers in all applicable territories, even for large companies with sizable revenues. That makes it downright impossible for even the pluckiest startup to consider. But that should open up markets for smaller, domestic companies, right?

Lack of domestic stimulus

Unfortunately, significant job growth does not occur due to data localization. There are short-term construction jobs if the data center requires a new building. After that, however, jobs are scarce, because the modern data center is mostly automated. The CBRE’s Data Center Solutions Group estimates that the average data center creates between 5 and 30 permanent, full-time positions[7]. Given the investment required to implement data residency, it hardly seems worth it based on employment opportunities.

Privacy and security

Well, it has to be more secure and offer more data protection, though! That’s the biggest piece of the benefit pie. Not so fast.

In reality, the exact opposite appears to be true. Regarding privacy, you’d hope that housing data in the country of origin would benefit the citizens. But think back to some of the countries passing data localization laws. Is a full set of personal information housed in a single jurisdiction good for the people of China? Or Russia? Very debatable. These nations are already surveillance states, and any data housed within their borders is under the control of their totalitarian governments.

Cybersecurity is another issue where expectations don’t match the real world. These implementations don’t happen in a vacuum; they inevitably cost a significant amount of money, money the company must divert from other areas of the business. Cybersecurity could be one of those areas.

Additionally, data residency results in server centralization. This presents a larger attack surface for malicious agents and could ultimately mean more data breaches, not fewer.

So, paradoxically, data localization could make it easier for state-sponsored threat actors to carry out successful attacks. Combined with the economic inefficiencies, privacy concerns, and technical problems, it becomes plain to see that decentralization is a better path forward. Companies can employ other, less-expensive methods such as end-to-end encryption to protect sensitive information.
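The claim that encryption, not geography, is what actually protects stored data can be made concrete. The sketch below is a hypothetical illustration in plain Python: it uses a toy stream cipher built from an HMAC-SHA256 keystream as a stand-in for a real cipher such as AES-256-GCM. The point is that ciphertext is unreadable in any jurisdiction as long as the key never leaves the data owner.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream with HMAC-SHA256 in counter mode.
    A toy stand-in for a vetted cipher like AES-256-GCM."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    # A fresh random nonce per message keeps keystreams from repeating.
    nonce = secrets.token_bytes(16)
    stream = keystream(key, nonce, len(plaintext))
    return nonce, bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same keystream recovers the plaintext.
    stream = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))

# The owner encrypts before the record ever leaves their device.
# Whichever country stores the ciphertext, only the key holder can read it.
key = secrets.token_bytes(32)
nonce, ct = encrypt(key, b"citizen medical record")
assert decrypt(key, nonce, ct) == b"citizen medical record"
print("ciphertext without the key:", ct.hex()[:24], "...")
```

This is why end-to-end encryption addresses the privacy goal of localization laws more directly than server placement: the stored bytes reveal nothing without the key, regardless of where the server sits.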

The AXEL Network

The AXEL Network is a decentralized, distributed system of servers backed by blockchain technology and the InterPlanetary File System. It gives users a secure, private way to share and store files on the internet. With server nodes located throughout the world, the AXEL Network offers both resiliency and performance. AXEL Go is the next-generation file-sharing platform built on the AXEL Network. It combines all of the advantages listed above with optional AES 256-bit encryption to provide exceptional privacy and security. Download it today for Windows, Mac, Android, or iOS and receive a free 14-day trial of our unrestricted Premium service. Enjoy the power of a decentralized, distributed network.


[1] Jonah Force Hill, “The Growth of Data Localization Post-Snowden: Analysis and Recommendations for U.S. Policymakers and Business Leaders”, ResearchGate, Jan. 2014, https://www.researchgate.net/publication/272306764_The_Growth_of_Data_Localization_Post-Snowden_Analysis_and_Recommendations_for_US_Policymakers_and_Business_Leaders#:~:text=Abstract,geographies%2C%20jurisdictions%2C%20and%20companies.

[2] Grady McGregor, “The world’s largest surveillance system is growing- and so is the backlash”, Fortune, Nov. 3, 2020, https://fortune.com/2020/11/03/china-surveillance-system-backlash-worlds-largest/

[3] United States International Trade Commission, “Global Digital Trade 1: Market Opportunities and Key Foreign Trade Restrictions”, usitc.gov, Aug. 2017, https://www.usitc.gov/publications/332/pub4716_0.pdf

[4] Agam Shah, Jared Council, “USMCA Formalizes Free Flow of Data, Other Tech Issues”, The Wall Street Journal, Jan. 29, 2020, https://www.wsj.com/articles/cios-businesses-to-benefit-from-new-trade-deal-11580340128

[5] “FACT SHEET ON U.S.-Japan Digital Trade Agreement”, Office of the United States Trade Representative, Oct. 2019, https://ustr.gov/about-us/policy-offices/press-office/fact-sheets/2019/october/fact-sheet-us-japan-digital-trade-agreement

[6] ITI, “ITI: U.S.-Kenya Trade Agreement Can Set New Global Benchmark for Digital Trade”, itic.org, Apr. 28, 2020, https://www.itic.org/news-events/news-releases/iti-u-s-kenya-trade-agreement-can-set-new-global-benchmark-for-digital-trade

[7] John Lenio, “The Mystery Impact of Data Centers on Local Economies Revealed”, areadevelopment.com, 2015, https://www.areadevelopment.com/data-centers/Data-Centers-Q1-2015/impact-of-data-center-development-locally-2262766.shtml


Filed Under: Business, Cybersecurity Tagged With: cybersecurity, data localization, national security, Privacy, snowden

February 18, 2021

Here’s Why Free Software Can Be a Poison Pill

There was a time when consumer expectations did not demand software be free. Sure, there has always been freeware, but it wasn’t the norm. If someone in the 1980s wanted a word processor, they expected to pay for it!

Today, these expectations have flipped. Why would someone pay for software or web services? Social media platforms are free. Big Tech companies like Google offer free alternatives to traditionally-paid programs such as word processors, spreadsheets, and visual presentation software. What’s the harm? The services are high-quality and users aren’t out a dime. It’s a win-win, right? Well, much like your relationship status during college, it’s complicated.

A costly endeavor

The truth is, software development is expensive. It’s always been expensive. And, even with the proliferation of outsourcing, it remains so today. It is a highly specialized skill requiring considerable knowledge and continued education. The median pay for a developer in the United States was over $107,000 in 2019[1]. Prices for outsourced developers vary by country but expect to pay around $30,000 a year for quality work[2]. Many development teams employ a mixture of domestic and foreign help.

Unlike the ’80s, when a small team could complete a program in a basement, larger teams are now necessary to deal with the complexities of modern computing. Big Tech’s full-featured products certainly require these sizeable teams of high-cost developers. Their offerings also typically need massive investments in physical infrastructure to keep services running for millions of potential users. Knowing all this, how do they provide the end products for free? Out of the goodness of the shareholders’ hearts?

The tradeoff

Unsurprisingly, no. Big Tech companies are some of the largest businesses in the world, with billions in yearly revenue. The “free” apps and services they provide do require a form of payment: your personal data. As the saying goes, “If you aren’t paying for the product, you are the product.”

Today, tech megacorporations collect an absurd amount of data on their users (and in Facebook’s case, even non-users[3]). The data they find most useful usually falls into the following categories:

  • Email receipts. Whom people email consistently can be a wealth of information for data miners.
  • Web activity. Big Tech wants to know which sites everyone visits, how long they stay there, and a host of other browsing metrics. They track across websites, analyze likes and dislikes, and even assess mouse cursor movement.
  • Geolocation. When tracking internet activity isn’t invasive enough, many companies evaluate where people go in the real world. Most don’t realize that their phones’ GPS sensors aren’t strictly used for directions to their aunt’s new house.
  • Credit card transactions. Purchase records outline a person’s spending habits. Since the entire point of collecting all of this data is to squeeze money out of the user in other ways, this info is extremely valuable.

Imagine the models companies can create of their users, given all of that information. They use these models to personalize advertisements across their platforms. Advertisements more likely to result in sales mean more revenue, so they have an incentive to collect as much data as possible. But that’s not the only way they monetize personal information. Many sell it to third parties, too. Are you creeped out yet?

Alternative data providers

Organizations called ‘alternative data providers’ buy up all of this information, repackage it, and sell it off to whoever wants it (usually financial institutions looking to gain broad insights about the direction of a given market).

As of 2020, there are over 450 alternative data providers[4], and what happens to your information after they get their hands on it is about as opaque as it gets. This is especially the case in the United States, as there are no federal privacy laws that set clear expectations regarding personal data sales and stewardship. However, there is hope with the passing of California’s new privacy law that Congress will finally tackle the subject.

Privacy policies

One way consumers can stay informed about an organization’s data collection practices is to read its privacy policy and terms of service agreement. There, they can find general information about its practices. Unfortunately, organizations seldom list specifics (i.e., which companies they share data with or sell it to). These documents also tend to be excessively long and filled with confusing legalese, making it difficult to extract even basic information and leading to a frustrating user experience.

It’s no wonder that, according to a Pew Research survey, only 22% of Americans read privacy policies “always” or “often” before agreeing to them[5]. Most just hit accept without a second thought. We recommend always looking into a company’s privacy policy and terms of service before using its products. If you don’t want to slog through the jargon, try ToS;DR, a website that breaks these documents down into readable summaries. It also gives Big Tech companies “privacy grades” based on what it finds. A few examples (note: “E” is the lowest grade):

  • Facebook – E. Big surprise here. The company that stores data, whether the person has an account or not, did not score well.
  • Amazon – E. Although online retail is their bread and butter, Amazon also dabbles in providing free apps and services such as the Kindle App. They track people across websites and sell consumer data to third parties, among other egregious tactics.
  • Google – E. Google collects biometric data, shares info with third parties, retains data after erasure requests, and much more.

Search for your favorite social media platform or Big Tech service and see how it stacks up. Spoiler alert: probably not very well.

Another consideration

Open-source projects have a reputation for poor cybersecurity since the developers are unpaid and less motivated to provide reliable support. Conversely, free Big Tech products typically get a pass on those risks. After all, their software is well-funded and receives developer support throughout its entire lifespan. This overlooks a few crucial points, though.

First, large tech corporations benefit immensely from a built-in following and the integrated marketing apparatuses at their disposal. This attracts a significantly higher baseline of users for any given service than a startup’s equivalent solution.  These massive user bases attract cybercriminals.

This leads to the second point; while these companies support their products and offer cybersecurity patches regularly, there will always be vulnerabilities. The services almost always run on centralized server farms, making for an enormous attack surface. And the products with the most users will always be the primary targets for phishing scams. So, it’s kind of a paradox. More marketing, support, and users lead to more attacks.

File sharing app examples

There are countless examples of vulnerabilities found in Big Tech apps and services; here are a few from the file-sharing sector:

Google Drive: In the fall of 2020, threat actors exploited a flaw in Google Drive to send push notifications and emails to users[6]. The messages contained malicious links that delivered dangerous malware. The situation affected hundreds of thousands of users.

Microsoft OneDrive: Although not officially breached, in April 2020, Microsoft announced a critical vulnerability in their OneDrive cloud app[7]. They quickly released a security fix, but it is unknown if hackers knew about the vulnerability beforehand or if they breached unpatched systems after Microsoft disclosed it.

Dropbox: In 2012, a hacker stole login credentials for over 68 million Dropbox users and sold them on the Dark Web. As if this weren’t bad enough, it took Dropbox four years to disclose the full scope of the breach! During that time, nearly 70 million users were in danger.

ShareIt: This platform may be lesser-known in the United States, but it has 1.8 billion users worldwide and is very popular throughout Asia and Russia. A recent security audit found critical flaws that could let hackers steal sensitive data[8]. Its website doesn’t even default to HTTPS, suggesting security isn’t a priority for the development team.

In conclusion, free platforms from multibillion-dollar corporations can be dangerous from both data collection and cybersecurity standpoints. Consumers should do their research and consider paying a small fee for privacy and security-focused competitors.

AXEL Go

AXEL is dedicated to giving data custody back to the user. We never sell personal information to third parties or mine accounts. Our file-sharing application, AXEL Go, utilizes blockchain technology, the InterPlanetary File System, and AES 256-bit encryption to provide the most secure cloud-sharing experience in the industry.

Sign up for AXEL Go and receive a free 14-day trial of our Premium service. Premium accounts receive five times more online storage than the Basic account, along with more security options and no restrictions on file sizes. After the trial, users pay $9.99/month to continue the Premium service or downgrade to the Basic account. So, stop worrying and share your documents securely with AXEL Go.


[1] “Occupational Outlook Handbook: Software Developers”, U.S. Bureau of Labor Statistics, 2019, https://www.bls.gov/ooh/computer-and-information-technology/software-developers.htm

[2] Julia Kravchenko, “How Much Does It Cost to Hire Developers: Software Developer Salary Guide 2018”, Hackernoon.com, March 12, 2018, https://hackernoon.com/how-much-does-it-cost-to-hire-developer-software-developer-salary-guide-2018-590fb9e1af2d

[3] Kurt Wagner, “This is how Facebook collects data on you even if you don’t have an account”, Vox, April 20, 2018, https://www.vox.com/2018/4/20/17254312/facebook-shadow-profiles-data-collection-non-users-mark-zuckerberg

[4] Rani Molla, “Why your free software is never free”, Vox, Jan. 29, 2020, https://www.vox.com/recode/2020/1/29/21111848/free-software-privacy-alternative-data

[5] Brooke Auxier, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar, Erica Turner, “Americans and Privacy: Concerned, Confused And Feeling Lack Of Control Over Their Personal Information”, Pew Research Center, Nov. 15, 2019, https://www.pewresearch.org/internet/2019/11/15/americans-attitudes-and-experiences-with-privacy-policies-and-laws/

[6] Lindsey O’Donnell, “Scammers Abuse Google Drive to Send Malicious Links”, threatpost, Nov. 2, 2020, https://threatpost.com/scammers-google-drive-malicious-links/160832/

[7] Davey Winder, “Windows OneDrive Security Vulnerability Confirmed: All You Need To Know”, Apr. 15, 2020, https://www.forbes.com/sites/daveywinder/2020/04/15/windows-onedrive-security-vulnerability-confirmed-all-you-need-to-know/?sh=517e144b6fa3

[8] Ron Amadeo, “‘ShareIt’ Android app with over a billion downloads is a security nightmare”, Ars Technica, Feb. 16, 2021, https://arstechnica.com/gadgets/2021/02/shareit-android-app-with-over-a-billion-downloads-is-a-security-nightmare/


Filed Under: Business, Cybersecurity, Tech Tagged With: big data, big tech, data collection, data privacy, free software, freeware, Privacy

January 22, 2021

Sharing user data with Facebook? WhatsApp with that?

Facebook-owned WhatsApp is receiving backlash for recent changes to its privacy policy. The topic has started an international conversation about the expectation of privacy and consumer data rights. We summarize the situation and how the fallout is pushing the discussion forward.

The application

WhatsApp is the most popular messaging application in the world, with over 2 billion monthly active users[1]. After Facebook purchased it in February 2014, privacy advocates became rightfully concerned. At the time, WhatsApp assured users it would not allow data sharing between the two companies. However, two short years later, in 2016, WhatsApp modified its terms and conditions to enable data sharing[2]. There was a grace period during which users could opt out, but that option has long since expired.

This concerning development was partially offset by WhatsApp’s implementation of end-to-end encryption for messages. End-to-end encryption means that only the intended recipient’s device can decrypt messages from the sender; no third party can read or mine them. Conceived in 2014, the feature wasn’t fully integrated until 2016. Since then, privacy worries mostly dissipated, even though WhatsApp’s relationship with Facebook never changed. Until the company released a privacy policy update in January 2021…
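The end-to-end guarantee described above rests on key agreement: the two devices derive a shared secret that the relaying server never learns. The sketch below illustrates that idea with a toy finite-field Diffie-Hellman exchange in plain Python. Real messengers use vetted elliptic-curve protocols (WhatsApp uses the Signal protocol), and the small Mersenne prime here is chosen for readability, not security.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters. Real systems use standardized groups
# or elliptic curves; this small Mersenne prime is illustrative only.
P = 2**127 - 1
G = 3

# Each device keeps a private key and publishes only G^x mod P.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Each end combines its own private key with the other's public value
# and arrives at the same secret. The server relaying alice_pub and
# bob_pub sees only the public values and cannot derive it.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

# Hash the shared secret down to a symmetric key for message encryption.
message_key = hashlib.sha256(alice_secret.to_bytes(16, "big")).digest()
print("shared key established:", message_key.hex()[:16], "...")
```

Because the symmetric key is derived independently on each device and never transmitted, the provider can relay ciphertext without ever being able to read it.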

Breakdown of privacy policy changes

So, what nefarious language did WhatsApp include that triggered a backlash? If anything, it was minor updates to already-existing policies. We believe these policies should have drawn ire long before now, but better late than never. According to the policy, WhatsApp could already share the following information with Facebook[3]:

  • Phone numbers of both users and their contacts
  • Profile names and pictures
  • Metadata, including app logs, status messages (i.e., when a user last logged in), and diagnostics information.

The new policy only expands upon this specifically when communicating with businesses. Facebook now stores user chats with companies. They can also access any data within those chats[4]. Certainly not ideal, but perhaps the reaction wouldn’t be so severe had they not required users to accept the changes by February 2021 or face account deletion. The combination of scary words such as “data collection,” “sharing,” and “Facebook” was exacerbated by an equally-frightening ultimatum. It came across as a power play rather than an update. Needless to say, people were not happy.

Harsh backlash

The backlash to the update was immediate. It became highly-publicized, with sensational headlines clogging up all of the internet's many tubes.

 

Then, celebrities took to Twitter to promote privacy-based alternatives such as Signal.

Use Signal

— Elon Musk (@elonmusk) January 7, 2021

Everybody can get back to uninstalling #Whatsapp now. https://t.co/dclPkSaWjH

— Edward Snowden (@Snowden) January 17, 2021

 

The hysteria around the policy announcement, along with the solicitation of alternatives from people such as Elon Musk, drove people to other encrypted messaging applications in droves. The open-source Signal app received the most significant boost, with an estimated 40 million new downloads within a week of the WhatsApp update.

Likewise, another private messaging client, Telegram, saw similar gains. In three days, they signed up 25 million new people for their service.

These substitute solutions are attractive due to their end-to-end encryption capabilities and the fact that Facebook, one of the biggest privacy offenders around, isn't involved at all. Both companies have more transparent privacy policies and offer compelling products. Time will tell if the poached users migrate back to WhatsApp or if the trend continues.

Signal experiences difficulties

Gaining tens of millions of new users for a bandwidth-intensive service is going to strain servers. While Telegram already had a massive user base and could withstand a short-term spike in usage, Signal had significant problems.

The Signal Foundation is a nonprofit organization that relies on private funding and donations from users. Interestingly enough, former WhatsApp co-founder, Brian Acton, is on Signal’s Board of Directors and remains one of its biggest funders[7]. Given its more “plucky underdog” status, it makes sense that the enormous increase in traffic caused issues. Within a week of its newfound popularity, the app experienced downtime and lost messages[8].

Consumers tend not to be sympathetic to poor user experiences. For the sake of all privacy apps, we hope that Signal can meet demand and deliver a great experience going forward. If people come to see privacy-based alternatives as "less than," they'll migrate back to the services they know.

WhatsApp combats misinformation

Undoubtedly feeling the heat, WhatsApp responded by clarifying the new policy and reassuring users that it doesn't share most data with Facebook[9]. To informed privacy advocates, this seems more like damage control than anything else. While this update didn't have a significant amount of new information aside from the Businesses section, it shed light on an ongoing concern about how WhatsApp shares information with Facebook.

A new path forward

The WhatsApp controversy is encouraging. It shows that privacy issues can move the needle, demand mainstream media coverage, and cause tens of millions of people to switch to better solutions. In a time of corporate surveillance, government intrusion, and censorship, it's nice to see everyday people begin to wake up. We hope this trend continues and the right to privacy becomes a standard consideration for app developers and service providers.

AXEL believes in the users’ right to privacy and data custody. Our products embody this philosophy. Our blockchain-based, decentralized cloud storage and file-sharing platform, AXEL Go, lets you store or send files confidentially. We don’t sell your information to advertisers or mine your files for data. It offers AES 256-bit encryption to keep your documents away from any would-be spies. Try it out today and receive 2GB of free storage and enough of our AXEL Tokens to fuel thousands of typical shares. The future doesn’t have to be mass surveillance and constant data breaches. We’re providing a different path. Won’t you join us?

 

[1] J. Clement, “Number of monthly active WhatsApp users worldwide from April 2013 to March 2020”, statista, April 30, 2020, https://www.statista.com/statistics/260819/number-of-monthly-active-whatsapp-users/#:~:text=As%20of%20March%202020%2C%20WhatsApp,billion%20MAU%20in%20February%202016

[2] Natasha Lomas, “WhatsApp’s privacy U-turn on sharing data with Facebook draws more heat in Europe”, TechCrunch, Sept. 30, 2016, https://techcrunch.com/2016/09/30/whatsapps-privacy-u-turn-on-sharing-data-with-facebook-draws-more-heat-in-europe/

[3] “WhatsApp Privacy Policy”, WhatsApp.com, July 20, 2020, https://www.whatsapp.com/legal/privacy-policy?eea=0

[4] Andrew Griffin, "WhatsApp new privacy terms: What do new rules really mean for you?", Independent, Jan. 9, 2021, https://www.independent.co.uk/life-style/gadgets-and-tech/whatsapp-new-privacy-terms-facebook-rules-explained-b1784469.html

[5] Saheli Roy Choudhury, “Indian ministry reportedly asked WhatsApp to drop privacy policy changes that sparked backlash”, CNBC, Jan. 19, 2021, https://www.cnbc.com/2021/01/20/india-has-reportedly-asked-whatsapp-to-withdraw-privacy-policy-update.html

[6] Tugce Ozsoy, Firat Kozok, “WhatsApp Dropped by Erdogan After Facebook Privacy Changes”,

[7] Andy Greenberg, “WhatsApp Co-Founder Puts $50M Into Signal To Supercharge Encrypted Messaging”, Wired, Feb. 2, 2018, https://www.wired.com/story/signal-foundation-whatsapp-brian-acton/

[8] Katie Canales, “Signal appears to be down for some users after the messaging app saw a record spike in downloads”,  Business Insider, Jan. 15, 2021, https://www.businessinsider.com/signal-app-down-users-report-messages-sending-problems-outage-2021-1

[9] “Answering your questions about WhatsApp’s Privacy Policy”, WhatsApp, Jan. 2021, https://faq.whatsapp.com/general/security-and-privacy/answering-your-questions-about-whatsapps-privacy-policy


Filed Under: Culture, Industry Related Tagged With: facebook, Privacy, signal, telegram, whatsapp, whatsapp privacy policy

November 20, 2020

What’s Inside California’s New Privacy Regulations

On November 3, 2020, California voters approved the California Privacy Rights Act (CPRA or Prop 24), a ballot initiative expanding consumer privacy protections. It easily passed, securing over 56% “Yes” votes. We look into some of its major provisions and examine how it differs from a previous California privacy law.

An amendment to current regulations

In 2018, the California Consumer Privacy Act (CCPA) passed and became law. While it outlined a framework for many consumer privacy protections, many felt it was inadequate given the current state of corporate data collection. So, a mere two years later (and less than one year after the CCPA officially went into effect), the CPRA has made significant changes to these stipulations.

An overview of the changes

Here is a brief summary of the significant changes. You can view the full bill here if you enjoy reading 50 pages of legalese (hey, everyone has their preferences).

A higher threshold for mandated compliance

The CCPA required businesses that used the personal information of 50,000 or more consumers or households to comply with the bill's privacy standards. The CPRA raises this threshold to 100,000 consumers or households, lessening the regulatory burden on small and medium-sized businesses that traffic in personal information.

Is this a win for privacy advocates? It’s unclear. Nobody wants to shutter small businesses due to onerous regulation, but could these exemptions lead to exploitation? While the biggest privacy offenders such as Facebook and Google will fall under the regulatory umbrella, smaller companies get a free pass. Could this create a loophole where corporations spin their data collection arms off into smaller shell companies to avoid compliance? Until governments and organizations address these possibilities, it remains a concern.

A wider net

CCPA restrictions applied to companies receiving 50% or more of their revenue from selling personal data. This seemingly straightforward wording created a giant loophole for the serial data offenders. In many cases, corporations argued they didn’t actually “sell” personal information. They simply gave it away to increase advertising revenue.

The CPRA closes this loophole by injecting the term “sharing” into the clause. As defined by the bill: “sharing, renting, releasing, disclosing, disseminating, making available, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to a third party for cross-context behavioral advertising, whether or not for monetary, or other valuable consideration…” results in mandatory compliance (assuming the other qualifiers are also met). This is a much more encompassing definition and an overall win for privacy advocates.

New data categories

Whereas the CCPA treated most personal information generally, the CPRA creates more granular data categories with distinct regulatory differences. Specifically, the CPRA defines certain types of data as being “Sensitive Personal Information.” This includes:

  • Government identifiers such as social security numbers or driver’s licenses
  • Financial accounts and login information
  • Detailed geolocation data
  • Info regarding race, religion, philosophical beliefs, or sexual preference
  • Union membership status
  • The content of private mail, email, and text messages
  • Genetic information
  • Biometric data
  • Health records

Consumers can now request that businesses limit the use of their Sensitive Personal Information to only what is necessary to provide the desired services. Companies would then no longer be able to sell or share sensitive information without prior consent and authorization.

It also sets up disclosure and opt-out standards for the use of Sensitive Personal Information that organizations must follow. This includes providing opt-out links on their businesses’ homepage and respecting opt-out signals sent by the consumers when they visit their site.

Expanded consumer rights

The CPRA outlines new privacy rights and modifies others already defined in the CCPA. Examples include:

The right to correction. Consumers can now demand businesses update their personal information if it’s inaccurate.

The right to opt-out of profiling. Data collectors use your personal information to construct a “profile” of you, then utilize automated decision-making technology to serve advertisements based on the profile. The CPRA allows consumers to opt-out of this practice.

An expanded right-to-know. Previously, the CCPA entitled consumers to information collected on them for the past 12 months. The CPRA entitles residents to all data collected.

Greater protection for minors. Businesses that collect and sell the personal information of minors under the age of 16 are subject to triple fines per incident, or $7500.

A more robust right to delete. The CPRA strengthens Californians’ right to delete their personal information. Companies now not only must delete the data but inform third parties they’ve shared or sold the data to of the deletion request as well. Note, the right to delete is subject to certain conditions and exemptions.

A new government agency

Under the CCPA, enforcement falls under the California Attorney General's responsibilities. This bill creates a dedicated government agency that will handle enforcement and penalties. California sure does love their government agencies! It's called the California Privacy Protection Agency (CPPA); don't worry if you can't keep all the acronyms straight. The CPPA will have a $5 million budget in 2021, which will increase to $10 million from 2022 on. Its creation will theoretically lessen the burden on the Attorney General's office and make enforcement more feasible.

Regular audits

Another important provision of the bill is the requirement for companies to audit their cybersecurity practices. As the constant hacks over the past few years have shown, problems lie not only in data collection but also in data protection. Sensitive information needs to be secured with baseline standards to prevent future phishing attacks, cyber theft, and identity fraud.

Organizations must present the findings from these audits to the newly-formed CPPA on a “regular basis.” Hopefully, this incentivizes companies working with private data to invest more in their cybersecurity solutions and reduce data breaches.

Opposition

The CPRA is a controversial bill, with a diverse set of proponents and opponents. However, the opponents may not be who you’d imagine. While one might assume that the big technology corporations in Silicon Valley aren’t too happy with the bill, none came out in outright opposition. There are two common explanations for this:

  • Nobody in Big Tech wants to come out against consumer privacy explicitly. Facebook, Google, and the other tech players have all had their share of bad publicity regarding privacy concerns over the past few years. Saying, “Oh yeah, we want all of your data and don’t want you to have any recourse against it,” likely wouldn’t play well to the general user.
  • Big Tech has sunk its digital claws into the legislation and weakened it considerably. This is actually the standard line for many of those who have come out against it.

Surprising opponents include the California American Civil Liberties Union[1], Consumer Action[2], and the California League of Women Voters[3].

A frequently cited concern

Those opposing the bill have similar problems with it. They conclude it's a "pay-for-privacy" scheme that unfairly affects people without the financial means to pay. This is because a clause in the legislation says that a company can charge a consumer requesting privacy the amount of the collected data's value. It helps tech organizations offset the lost advertising revenue and gives consumers a clear motivation to opt in to data collection.

An unclear future

Though not everyone agrees that the CPRA is the best possible solution, it’s difficult to argue it isn’t more substantial than the CCPA. It will be fascinating to see the legislation’s future effects on the tech business and consumer privacy. If successful, it could set in motion a slew of similar bills in other states. If it becomes a bureaucratic quagmire, it might stall regulation throughout the country.

One quirk of the CPRA is that lawmakers can no longer amend it unless the amendment is to “further privacy rights.” That may sound good, but its nebulous wording could open up legal challenges down the road if aspects of it need adjustment.

AXEL’s commitment

At AXEL, we believe in everyone’s right to privacy. That’s why we develop file-sharing and cloud storage solutions that prioritize privacy and security. No government-enforced edicts are necessary for us to respect your personal information. It’s an integral component of our corporate philosophy. If you need to share or store files in a safe, private way, download AXEL Go for Windows, Mac, Android, or iOS. Get out from under the watchful eye of Big Tech and experience a better way to use the internet.

 

[1] Andrea Vittorio, "ACLU Among Activists Opposing Update to California Privacy Rules", Bloomberg Law, July 22, 2020, https://news.bloomberglaw.com/privacy-and-data-security/aclu-among-activists-opposing-update-to-california-privacy-rules

[2] Alegra Howard, Linda Sherry, “Consumer Action opposes California Proposition 24”, consumer-action.org, Aug. 19, 2020, https://www.consumer-action.org/press/articles/consumer-action-opposes-california-proposition-24

[3] “League of Women Voters Opposes Prop 24”, prnewswire, Oct. 28, 2020, https://www.prnewswire.com/news-releases/league-of-women-voters-opposes-prop-24-301162344.html


Filed Under: Legal Tagged With: cpra, data privacy, personal information, Privacy, prop 24

October 30, 2020

You Can’t Crack Good Encryption But You Can EARN IT

Encryption is a hotly debated topic these days. Privacy advocates love it; governments and law enforcement are less enthusiastic. One of the most significant discussions regarding encryption at the moment is the United States’ EARN IT Act. This controversial piece of legislation could have major privacy implications moving forward.

The EARN IT Act’s journey

On March 5, 2020, a bipartisan group of U.S. politicians, including Sen. Lindsey Graham (R-South Carolina), Sen. Richard Blumenthal (D-Connecticut), Sen. Dianne Feinstein (D-California), and Sen. Josh Hawley (R-Missouri) introduced the EARN IT (Eliminating Abusive and Rampant Neglect of Interactive Technologies) Act. The legislation aimed to curb online child sexual exploitation through the creation of a national commission.

The commission

The act establishes a government commission consisting of 19 appointed individuals from various sectors. It includes high-ranking officials from the Department of Justice, the Department of Homeland Security, the Federal Trade Commission, as well as representatives from top law enforcement agencies, constitutional law experts, survivor groups, and more.

The commission would be responsible for devising a set of "best practices" that online companies would need to follow to maintain immunity from liability regarding third-party content posted on their platforms. Congress would review and approve the list of mandated best practices. Once approved, the commission would need to certify companies as compliant with the policies before they received immunity. Simply put, immunity is not guaranteed. Online organizations would have to "earn it" (see what they did there?).

Businesses that do not follow the standard set of best practices would need to prove they have reasonable alternative methods to prevent child exploitation on their platform. Those deemed by the commission not to meet the minimum standards would be liable for lawsuits from victims of sexual exploitation.

Amendments to the bill

This summer, while the bill made its way through the Senate Judiciary Committee, lawmakers altered it to empower the states to form their own rules. The commission would still be retained along with its guidelines for best practices. However, it would now be up to the states to bring civil and criminal lawsuits against content platforms that don't do enough to prevent child exploitation.

In either form, the EARN IT Act, at its core, attempts to erode the legal protections stipulated by Section 230 of the Communications Decency Act of 1996, and it could create obstacles for the use of encryption technologies.

Section 230

The Communications Decency Act of 1996 is a component of the more comprehensive Telecommunications Act of 1996. This was the first law that incorporated the Internet into broadcast regulations. Section 230 of the CDA states:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

This means that content platforms aren’t liable for the content people post on them. It protects them from all sorts of nasty legal situations.

The most current form of the EARN IT Act affords states more leeway to decide whether a content platform is culpable for sexual crimes committed against minors.

The effect on encryption

So, how does this relate to encryption? If passed, the EARN IT Act would significantly weaken its utility. The first iteration never specifically mentioned encryption, although the implications for the technology were evident. If, for instance, the government held social media websites liable for facilitating child exploitation via encrypted messages, why would the platform ever allow encrypted messages in the first place?

The whole point of end-to-end encryption is that the centralized platform doesn't hold the keys to decrypt messages between two private parties. This ensures privacy and that Big Brother isn't watching over your shoulder. Section 230 prevented roadblocks to encrypted communications. But, if the government can hold the content of encrypted messages against a business in civil or criminal cases, the organization has a massive incentive not to offer encryption services.
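The key-holding idea can be sketched with a textbook Diffie-Hellman exchange. This is a toy with tiny parameters, not the Signal-style protocol real messengers use, but it shows why a platform that only relays public values never learns the key protecting the conversation:

```python
import hashlib
import secrets

# Public parameters everyone, including the relay server, knows.
# Toy-sized textbook values; real protocols use 2048-bit+ groups or X25519.
P, G = 23, 5

# Each party picks a private exponent and publishes only G^x mod P.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both sides derive the same shared secret; the server relaying
# alice_pub and bob_pub has no way to compute it.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared

def xor_cipher(key_int: int, data: bytes) -> bytes:
    # Stretch the shared secret into a keystream and XOR it with the data.
    # (Insecure toy cipher, purely for illustration.)
    stream = hashlib.sha256(str(key_int).encode()).digest()
    stream = (stream * (len(data) // len(stream) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

ciphertext = xor_cipher(alice_shared, b"meet at noon")
plaintext = xor_cipher(bob_shared, ciphertext)
assert plaintext == b"meet at noon"
```

The platform carries `alice_pub`, `bob_pub`, and `ciphertext`, yet it cannot decrypt, which is precisely the guarantee liability pressure would push companies to abandon.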

The amended EARN IT Act that passed through the Senate Judiciary Committee does mention encryption. In fact, it stipulates that end-to-end encryption by itself is not a reason to remove the Section 230 protections for a company. On the surface, this looks like a more reasonable bill. However, it suggests that organizations scan messages before being encrypted to check for suspicious exploitative content. If any is present, they would have to forward them to the proper government authority for closer scrutiny. The practice is called “client-side scanning.”
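Mechanically, client-side scanning means inspecting the plaintext against some detection list before the message is ever encrypted. The sketch below is a simplified stand-in (real proposals lean on perceptual hashing such as PhotoDNA rather than exact-byte digests, and the function names here are hypothetical):

```python
import hashlib

# Hypothetical blocklist of known-bad content digests. Real systems
# would use perceptual hashes that match near-duplicate images, not
# SHA-256 of exact bytes.
BLOCKLIST = {hashlib.sha256(b"known illegal payload").hexdigest()}

def client_side_scan(message: bytes) -> bool:
    """Return True if the plaintext message passes the scan."""
    return hashlib.sha256(message).hexdigest() not in BLOCKLIST

def send(message: bytes, encrypt):
    # The scan runs on the PLAINTEXT, before encryption. That ordering
    # is the crux of the criticism: the endpoint inspects content that
    # end-to-end encryption was supposed to keep private.
    if not client_side_scan(message):
        raise ValueError("message flagged for review")
    return encrypt(message)

# An innocuous message sails through; a blocklisted one never reaches
# the encryption step at all.
ok = send(b"hello", encrypt=lambda m: m[::-1])
```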

So, would this really allow for end-to-end encryption? It appears to undermine its usefulness when companies scan every message before transmission.

Far-reaching consequences

AXEL is a data custody and privacy advocate. Our file-sharing and storage platform, AXEL Go, prioritizes privacy and security. We provide the option to use encrypted password protection for all shared files.

We understand that this is a complex issue, and we want to prevent the exploitation of minors. However, this legislation could have a chilling effect on privacy and the future of encryption.

Encryption is a tool. It isn’t only useful for criminals. Privacy is a right for everyone, and this technology helps facilitate it. It doesn’t just hide your data from governments and corporations, but also malicious agents. Data breaches happen on a daily basis. If the hackers only score encrypted data, the haul ends up being useless. It helps prevent identity theft, as well as stolen credentials and payment information. Encryption is a part of the solution, not the problem. We can usher in a better online experience. One that isn’t fraught with invasions of privacy and data collection. Client-side scanning of all messages is not on the path toward this future.

If you’d like a secure, private file sharing and storage platform, download AXEL Go. It’s an easy-to-use program available on Windows, Mac, iOS, and Android devices. It uses secure technologies such as blockchain, the InterPlanetary File System (IPFS), and the aforementioned password encryption to ensure your data stays safe and confidential. Sign up for one of our free, Basic accounts and you will receive 2GB of free online storage, along with enough of our AXEL Tokens to fuel thousands of shares across our decentralized network.


Filed Under: Legal Tagged With: EARN IT Act, encryption, encryption law, Privacy



© Copyright 2024 Axel ®. All Rights Reserved.
Terms & Policies