AXEL.org

September 17, 2021

Convenient or Monopolistic? Epic’s Challenge to Apple’s “Walled Garden”

On August 13, 2020, Epic Games, the developer and publisher of the massively popular online game Fortnite, tried something most companies would be too scared to do: it picked a fight with Apple. That day, Epic announced a 20% discount on “V-Bucks,” Fortnite’s in-game currency, but only for players who purchased it directly from Epic rather than through Apple’s App Store.

This was an intentional violation of Apple’s terms of service. Apple takes a 30% commission on all in-app purchases, and Epic wanted that money for itself. Within hours, Apple removed Fortnite from the App Store for violating its terms of service, and a lawsuit from Epic quickly followed [1].
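The stakes of that 30% cut are easy to see with back-of-the-envelope arithmetic. In the sketch below, the $9.99 bundle price is a hypothetical figure chosen for illustration; only the 30% commission and 20% discount come from the story above:

```python
# Hypothetical V-Bucks bundle price; the 30% and 20% rates are from the article.
list_price = 9.99

# Sold through the App Store: Apple keeps its 30% commission.
epic_keeps_via_apple = round(list_price * (1 - 0.30), 2)

# Sold directly by Epic at the announced 20% discount (no Apple commission).
epic_keeps_direct = round(list_price * (1 - 0.20), 2)

print(epic_keeps_via_apple)  # 6.99
print(epic_keeps_direct)     # 7.99
```

Even after discounting, Epic would keep roughly a dollar more per bundle sold directly (ignoring its own payment-processing costs), which is why the direct-payment option was worth the fight.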

On September 10, 2021, that lawsuit received a ruling. The judge sided with Apple on nine of ten counts, but ordered Apple to loosen restrictions on alternative payment options [2]. However, Apple CEO Tim Cook stated that even if an app uses a non-Apple payment option, Apple will still collect its 30% commission [3]. So, what’s next? Epic has appealed the ruling, but for now, Apple maintains tight control over the apps on its App Store. Ultimately, this case highlights the uniqueness of Apple’s software philosophy, and how its relationships with third-party developers frequently draw ire.

A Walled Garden

For years, Apple’s software philosophy has been described as a “walled garden”: Apple’s software is simple, secure, and easy for consumers to use, but Apple strongly discourages, and in some cases forbids, users and developers from leaving that garden. Apple says this approach is necessary to protect its users and to differentiate itself from Android, a competitor with a more open ecosystem [4]. The result is increased simplicity for the user, along with increased dependence on Apple software. So while this approach does protect users from dubious third parties, it also locks them into Apple’s ecosystem.

While Apple claims its walled garden exists to offer increased security and simplicity for its users, there are other reasons Apple embraces this philosophy. Because Apple has full control of its ecosystem, it can enforce practically any rule it wants, including a 30% commission on in-app purchases. For third-party developers, this means putting up with Apple’s demands or risking expulsion from the garden. And that’s exactly what happened to Epic Games.

The Legal Argument

The central conflict of Epic Games v. Apple was whether Apple’s walled garden approach violates antitrust law. Specifically, Apple’s requirement that users purchase in-game items only through the App Store, rather than through another party, was cited as evidence of monopolistic behavior [2]. Apple, on the other hand, argued that it is free to do business (or not do business) with any other company, and that restricting third-party payment services was within its rights as a business. Simply put, the case pitted a first-party hardware maker against third-party software developers.

Ultimately, the court ruled for Apple on nine of ten counts, and Epic has stated its intention to appeal the decision [2]. In the one ruling against Apple, Judge Yvonne Gonzalez Rogers wrote that “Apple created a new and innovative platform which was also a black box. It enforced silence to control information and actively impede users from obtaining the knowledge to obtain digital goods on other platforms. Apple has used this lack of knowledge to exploit its position” [2]. However, because the judge ruled in favor of Apple on the other nine counts, few changes are likely to occur.

While there was potential for a landmark ruling that would shake Apple to its core, the decision that was handed down will likely not have a massive effect on either company. The only change Apple must make is to allow developers to use third-party payment services, and nothing stops Apple from collecting its 30% commission from those developers anyway. Ultimately, while this case had the potential for massive change, the ruling ensures that Apple’s walled garden philosophy will continue.

Security and Your Rights

While Apple argued that its App Store policies were there to protect users, we know that isn’t the main reason for those restrictive rules. Simply put, the purpose of Apple’s walled garden approach is to keep users locked into the Apple ecosystem. While some users do prefer this method, and it can protect users from unsavory third-party developers, it still infringes upon the rights of consumers.

Unfortunately, this philosophy is all too common among Big Tech companies. Sacrificing user privacy is a big win for Big Tech’s bottom line, but a huge loss for privacy rights. Corporations continue to amass hoards of personal data to sell to advertisers while your privacy is violated. With Amazon, Google, and others offering endless new ways to collect your data, it’s fair to ask: are you the customer, or the product?

Thankfully, there are businesses that prioritize security and personal rights. That’s where AXEL comes in. AXEL believes that privacy is a human right. With this in mind, we created AXEL Go, a secure file-sharing and storage software. Offering industry-leading encryption and decentralized blockchain technology, AXEL Go is the best way to protect yourself or your business from cybercriminals. With AXEL Go, there’s no compromise between security and privacy rights. After all, our business is protecting your data, not collecting it. If you’re ready to try the most secure file-sharing and storage software, get two free weeks of AXEL Go here.

[1] Statt, Nick. “Apple Just Kicked Fortnite off the App Store.” The Verge. August 13, 2020. https://www.theverge.com/2020/8/13/21366438/apple-fortnite-ios-app-store-violations-epic-payments.

[2] Newman, Daniel. “Does The Epic Ruling Open The Door For Apple’s Competition?” Forbes. September 16, 2021. https://www.forbes.com/sites/danielnewman/2021/09/16/does-the-epic-ruling-open-the-door-for-apples-competition/.

[3] Adorno, José. “Apple Can Still Charge Its App Store 30% Fee Even after Epic Ruling, Analysts Say.” 9to5Mac. September 14, 2021. https://9to5mac.com/2021/09/14/apple-can-still-charge-its-app-store-30-fee-even-after-epic-ruling-analysts-say/.


[4] Beres, Damon. “All the New Ways Apple Is Trying to Take Over Your Life.” Slate Magazine. June 08, 2021. https://slate.com/technology/2021/06/apple-wwdc-ios15-new-features-walled-garden.html.

Filed Under: Business, Legal Tagged With: apple, big tech, law, lawyer, privacy law

August 30, 2021

Data Privacy and Security Increase Profitability in the Cannabis Industry

Experts estimate that the cannabis industry is currently worth $60 billion, and that number is predicted to grow to $100 billion by 2030. As this industry grows and the customer base gets larger, so too does the need for modern data custody technologies. It might not be obvious at first glance, but data custody and security are critical components of running a successful cannabis business. Here are four reasons why.

The Importance of Data Security in the Cannabis Industry

First, medical dispensaries may qualify as “healthcare providers” under the Health Insurance Portability and Accountability Act (HIPAA). Under HIPAA, healthcare providers must implement safeguards to prevent the incidental disclosure of any patient’s “protected health information,” and each violation can carry a fine of up to $50,000.

Second, each cannabis company has numerous trade secrets to protect. These could include growing processes, distribution plans, recipes for edibles, extraction techniques, soil mixtures, etc. The theft of any of these trade secrets could be disastrous to a company.

Third, cannabis companies must comply with (sometimes conflicting) state laws. For example, in California, the Medicinal and Adult-Use Cannabis Regulation and Safety Act (MAUCRSA) requires cannabis delivery companies to maintain records of every person who receives a delivery. At the same time, the California Consumer Privacy Act (CCPA) gives customers the right to demand that companies delete any records pertaining to them.

Fourth, data breaches result in damage to a company’s reputation. Dispensaries often sell T-shirts and other merchandise stamped with the company logo to foster customer loyalty, but a newsworthy data breach could shake that loyalty. Further, data breaches could damage the industry’s image as a whole and become a roadblock to legalization efforts at the federal level.

Room for Improvement

Last year, a group of ethical “white hat” hackers discovered a breach in the THSuite point-of-sale system, which is used by many dispensaries. Through it, the hackers were able to access roughly 85,000 unencrypted files containing personally identifying information on 30,000 people, including names, phone numbers, addresses, emails, birthdays, images of state-issued IDs, signatures, quantities of cannabis purchased, and medical ID numbers.

This breach, and all the reasons discussed above, highlight the need for modern technological solutions. The International Cannabis Bar Association (INCBA) and AXEL are working together to bring these solutions to Bar members. INCBA members will now receive a 20% discount when they sign up for Premium or Business Plan subscriptions of AXEL Go. AXEL Go is the safest way to collect, store and share files during in-office, hybrid and remote work situations.

AXEL’s patented blockchain technology and AES-256 encryption help attorneys collect, store, and share client files in a user-friendly manner that is highly resistant to hackers, unauthorized access, and ransomware attacks. The decentralized nature of the network ensures that there is no single point of failure. Further, files uploaded to the AXEL network are heavily encrypted, sharded, and scattered across 400+ global servers, providing a high level of security without sacrificing speed. Sensitive files and shifting regulatory frameworks in the cannabis industry call for the abundance of caution that AXEL Go provides. INCBA members can sign up for a 14-day trial of AXEL Go and redeem discounts here.
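As a rough illustration of the encrypt-then-shard idea (not AXEL’s actual implementation), here is a minimal sketch using AES-256-GCM from the widely used third-party `cryptography` package; the 64-byte shard size and the function names are assumptions made for the example:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_and_shard(data: bytes, key: bytes, shard_size: int = 64):
    """Encrypt with AES-256-GCM, then split the ciphertext into fixed-size
    shards that could be scattered across independent servers."""
    nonce = os.urandom(12)  # 96-bit nonce, standard for GCM
    ciphertext = AESGCM(key).encrypt(nonce, data, None)
    return nonce, [ciphertext[i:i + shard_size]
                   for i in range(0, len(ciphertext), shard_size)]

def reassemble_and_decrypt(nonce: bytes, shards: list, key: bytes) -> bytes:
    """Rejoin the shards and decrypt; GCM also verifies integrity."""
    return AESGCM(key).decrypt(nonce, b"".join(shards), None)

key = AESGCM.generate_key(bit_length=256)  # 256-bit AES key
nonce, shards = encrypt_and_shard(b"confidential client file", key)
assert reassemble_and_decrypt(nonce, shards, key) == b"confidential client file"
```

No single shard reveals anything useful on its own, which is the intuition behind combining strong encryption with distribution across many servers.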

Filed Under: Cybersecurity, Legal Tagged With: cybersecurity, data privacy, law, lawyer

March 22, 2021

The Ethical Responsibility for Data Security in Finance, Law, and Healthcare

Few would argue that businesses today lack an ethical responsibility to adequately protect and secure their customers’ data. The responsibility is even more crucial for organizations with fiduciary duties to their clients or consumers, such as those in the finance, legal, healthcare, and insurance sectors. Let’s dig into each of these industries in the United States, look at their unique ethical demands regarding data security, and find some common solutions.

Finance

The financial industry includes banks, investment firms, real estate companies, and insurance organizations. According to the International Monetary Fund, it is the sector targeted most often by hackers [1]. It makes sense: in a 2020 Verizon Communications survey, researchers found that 86% of data breaches are financially motivated [2]. And who has more money than the financial industry?

Hackers target these institutions in a variety of ways. One of their most common tactics is attempting to gain access to customer login info. Direct attacks against an organization’s reserves gain immediate attention and mitigation, but hackers can take over a user account and move around smaller sums for much longer periods.

Another method they use is stealing sensitive financial documents. It provides the malicious agents with a treasure trove of confidential data to use for identity theft.

So, what ethical obligation do financial institutions have to their clients to secure this data? Since they’re such huge targets, they tend to employ data protection strategies more sophisticated than average. In 2020, the Federal Trade Commission proposed amendments to the Safeguards Rule and the Privacy Rule of the Gramm-Leach-Bliley Act. Under these proposals:

  • Financial institutions would need to safeguard customer data more robustly, such as by encrypting all information.
  • Customers could opt out of data-sharing arrangements between banks and third parties.
  • Banks would require employees to pass multi-factor authentication (MFA) to access client data.

The FTC has not ratified these amendments yet, but they would serve as a much-needed update to the current regulatory framework.
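The MFA requirement in the proposals above is commonly implemented with time-based one-time passwords. Below is a minimal RFC 6238 TOTP sketch using only Python’s standard library; the 30-second interval and 6-digit default are the common conventions, not anything mandated by the FTC:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, interval=30, digits=6):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    t = int(time.time() if for_time is None else for_time)
    counter = struct.pack(">Q", t // interval)  # moving factor = time step count
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T=59s
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, for_time=59, digits=8))  # 94287082
```

Because the code depends on both a shared secret and the current time, a stolen password alone is no longer enough to access client data.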

Law

Legal professionals now face an even greater risk to their clients’ personal information. Processing strictly confidential information has always put large targets on lawyers. But the COVID-19 pandemic forced many of them out of the office and courtroom and into their dens. Working from home is the new normal for legal pros, and that means more cybersecurity risk. Whereas the office was likely a closed system monitored daily by IT experts, it’s much more challenging to evaluate weaknesses in everyone’s home networks. Couple that with the fact that lawyers, on the whole, aren’t the most technically literate people in the world, and you’ve got a recipe for data breaches.

The American Bar Association sets broad ethical expectations for data security throughout its Model Rules of Professional Conduct [3]. A recent formal opinion published by the organization outlines them in greater detail [4], specifically for those engaged in virtual practice. The opinion includes the following provisions:

  • Lawyers must make “reasonable efforts to prevent inadvertent or unauthorized access [to client data].” Today, a reasonable attempt goes well beyond attaching a confidential document to an email and sending it off with nothing but the hope that it doesn’t fall into the wrong hands.
  • Virtual practitioners should look into setting up Virtual Private Networks (VPNs), keeping the computer’s operating systems updated so that security patches stay current, utilizing file encryption, using MFA, setting strong passwords, and changing them regularly.
  • Legal professionals must vet software and hardware providers to ensure proper security.
  • Lawyers should never use smart speakers (Alexa, Google Home, etc.) or virtual assistants (Siri) when conducting confidential business. These “helpers” listen to every word that is said and can be hacked easily by malicious agents.
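For the “strong passwords” recommendation in the list above, Python’s standard-library `secrets` module is one reasonable starting point; the 16-character default and the character set here are illustrative choices, not an ABA requirement:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Build a random password from a cryptographically secure source;
    unlike random.choice, secrets.choice is suitable for security use."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run
```

Pair generated passwords with a password manager so they never need to be memorized, written down, or reused across services.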

Hopefully, the ABA codifies the recommendations in this opinion into its formal standards.

Healthcare

The medical industry also deals with extremely private, confidential information and is susceptible to drawing hackers’ attention. 2020 was an especially bad year, as the rise of COVID-19 caused a 55% spike in data breaches compared to 2019 [5]. It’s a chilling reminder of how opportunistic threat actors can be: sensing healthcare providers were stretched to the max and short on resources, they attacked.

Common motives for targeting the healthcare industry include stealing patient medical records for resale on the Dark Web, identity theft, and extortion schemes, as well as ransomware attacks that cripple critical systems until the organization pays a hefty fee.

The United States Department of Health and Human Services sets national regulations for healthcare data security through the HIPAA Security Rule. Here are some of its guidelines:

  • Organizations must have physical and technical security measures enacted for hosting sensitive health data. Examples include facility access limits, computer access controls, and strict limitations on attempts to transfer, remove, or delete patient records.
  • Technical systems must have automatic log-off settings, file encryption capabilities, regular audit reporting, and detailed tracking logs of user activity.
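Two of those technical safeguards, automatic log-off and a tracking log of user activity, can be sketched in a few lines. Everything below (the class name, the 15-minute timeout, the in-memory log) is hypothetical and for illustration only:

```python
import time

TIMEOUT_SECONDS = 900  # illustrative 15-minute inactivity limit
AUDIT_LOG = []         # in practice this would be durable, append-only storage

class Session:
    """Toy session that audits record access and enforces auto log-off."""
    def __init__(self, user):
        self.user = user
        self.last_activity = time.monotonic()

    def access_record(self, record_id):
        # Automatic log-off: refuse access after too much idle time.
        if time.monotonic() - self.last_activity > TIMEOUT_SECONDS:
            AUDIT_LOG.append((self.user, "auto-logoff", None))
            raise PermissionError("session expired; please log in again")
        self.last_activity = time.monotonic()
        AUDIT_LOG.append((self.user, "read", record_id))  # tracking-log entry
        return f"record {record_id}"

s = Session("dr_smith")
s.access_record("patient-42")
print(AUDIT_LOG)  # [('dr_smith', 'read', 'patient-42')]
```

The audit trail answers the "who touched what, and when" questions that regulators ask after an incident, which is why the Security Rule pairs it with access controls.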

With COVID cases declining and vaccinations increasing, the healthcare sector could soon return to normal and start allocating more resources to cybersecurity. For the first time in over a year, there’s cause for optimism.

Conclusion

With cyberattacks on the rise, there’s still much room for improvement in these industries. Organizations that treat cybersecurity as a priority should go above and beyond legal requirements, combining the right technical solutions with ongoing education. Usually, the weakest links in a network are the employees themselves, so train them regularly on common phishing techniques and how to spot them. You’ll have a more resilient workforce that won’t fall for the common scams that can put your organization at serious risk.

AXEL Go

Part of the equation is still using suitable technical systems. If your company transfers or stores confidential data, you need to ensure it’s locked down. AXEL Go is a decentralized, private and secure file-sharing and storage platform. It offers industry-leading security features that set it apart from the typical Big Tech applications. It uses blockchain technology, advanced file sharding, the InterPlanetary File System, and military-grade encryption to keep important documents away from hackers. Try AXEL Go and gain access to all of its premium features for only $9.99/mo. It’s the safest way to share and store online.

 

[1] Jennifer Elliott and Nigel Jenkinson, “Cyber Risk is the New Threat to Financial Stability”, IMF.org, Dec. 7, 2020, https://blogs.imf.org/2020/12/07/cyber-risk-is-the-new-threat-to-financial-stability/

[2] “2020 Data Breach Investigations Report”, Verizon, May. 19, 2020, https://enterprise.verizon.com/resources/reports/dbir/?CMP=OOH_SMB_OTH_22222_MC_20200501_NA_NM20200079_00001

[3] American Bar Association, “Model Rules of Professional Conduct”, Americanbar.org, https://www.americanbar.org/groups/professional_responsibility/publications/model_rules_of_professional_conduct/model_rules_of_professional_conduct_table_of_contents/

[4] American Bar Association Standing Committee On Ethics And Professional Conduct, Formal Opinion 489, Americanbar.org, March 10, 2021, https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/aba-formal-opinion-498.pdf

[5] “Healthcare Breach Report 2021: Hacking and IT Incidents on the Rise”, Bitglass, Feb. 17, 2021, https://pages.bitglass.com/rs/418-ZAL-815/images/CDFY21Q1HealthcareBreachReport2021.pdf?aliId=eyJpIjoiOE54NGRRTkhCZDY3aUxGMiIsInQiOiJ0RTZ1QVZXbnFPUGRhZXhVbmhyMmVnPT0ifQ%253D%253D


Filed Under: Cybersecurity Tagged With: cybersecurity, data protection, data security, finance, healthcare, law

August 21, 2019

Why Data Breaches are so Damaging and how the Law has Failed Consumers

Very few times in history has a group of people sat down with the purpose of writing a set of new laws to improve society. Instead, laws are usually written to solve specific problems, and they pile up over the decades. While the effectiveness of any particular law can always be debated, the rapid pace of technological advancement over the past 20 years, especially compared to the pace of the lawmaking process, has introduced new challenges: laws become quickly outdated, sometimes even by the time they take effect.

The results of this are acutely apparent in the cross-section between the fields of cybersecurity and consumer protection, namely data breaches.

The majority of consumer protection laws in the United States were written for a society concerned with immediate product safety and compensation for resulting injuries, not for the nebulous and incalculable injuries that millions may sustain when private records are exposed.

Why are data breaches so damaging?

The unique problem of data breaches stems from the fact that a breach of privacy carries no specific harm in and of itself. Instead, it is the later misuse of the breached information that leads to harm. However, with data breaches occurring on a near-daily basis, specific financial or reputational damage is nigh impossible to link causally to any single breach. And with our laws built around the concept that calculable damages are the basis for remuneration, we are left constantly and increasingly victimized but unable to seek just compensation.

Some would argue that even more problematic is the irreparable nature of many of the most severe data breaches. Once a name and social security number are leaked, that identity is permanently and irreversibly at risk for being used fraudulently. While one could always apply for a new social security number, the Social Security Administration is extremely reluctant to issue new identities, and while that is a debate for another time, it goes to show just how difficult it can be to recover from a breach. Victims are permanently marred and at increased risk for future injuries resulting from a single breach, no matter how much time has passed.

Because the damage resulting from a data breach is so far removed, temporally and causally, from the breach itself, adequate compensation is rarely won, if it is even sought. Was it the Equifax breach, the MoviePass breach, or one of the innumerable other breaches this year that will result in your identity being stolen and used to take out fraudulent loans a decade from now?

Moreover, even if you could show that it was MoviePass’ negligence that led to your identity being stolen, what compensation can you seek from a company that has been defunct for years? Our laws were not written to address these issues adequately. Our legal system rarely ponders questions of uncertainty and possibility, and that is the perfect summary of what victims face in the aftermath of a breach: uncertainty and possibilities.

For all the uncertainty victims face, the solutions going forward as a country are equally opaque.

It would be easy to write a draconian law punishing companies for exposing private data, but as is often the case, that could have unintended consequences, such as pushing data overseas, where looser security and weaker privacy laws may exacerbate the problem. Instead, it’s going to take a significant shift in our collective consciousness over how data is handled.

Meanwhile, laws written decades ago to manage the telecommunications and transmissions of their era are being used to handle complex cybersecurity and data privacy cases.

This can’t come just from one party though; companies need to seriously consider what data they need to collect, and what information needs to be retained on a long-term basis. Consumers have to take ownership of their data and demand a higher quality of service from corporations and governments over how their data is collected and used.

As a whole, we must recognize the value of data, and the dangers we expose ourselves to by collecting it (and why it might even be best to not collect data at all in many circumstances).

Just as holding valuables such as gold and art entails a security risk, so too does holding data. If people started treating data like the digital gold it really is, maybe then we could all come together and work out a solution.

But until then, I’ll be keeping my data to myself.

Filed Under: Culture, Cybersecurity, Legal Tagged With: data, data breach, data breaches, data collecting, data collection, data custody, data mining, data privacy, data protection, data security, law, lawyer, legal, legal tech, online privacy, Privacy, private

July 3, 2018

California Thinks It’s Fixing Data Privacy. It’s Not.

“Your move,” says the new California Consumer Privacy Act of 2018.

Except, this isn’t a game of chess—picture it more like a million-piece jigsaw puzzle called “Cats Around the World,” and it’s been spread out on your dining room table for the past twenty years and you’re only 40 pieces in.

(Sounds like a party, am I right?)

Here’s the thing: the data privacy law signed on Thursday by California Gov. Jerry Brown is a new piece in the jigsaw puzzle of laws that has served as the U.S.’s means of protecting its citizens’ privacy. It’s certainly a huge step in terms of improved privacy law, but it’s not quite clear how it fits into the nation’s “big picture.”

So far, the U.S.’s privacy law game is patchwork and somewhat messy. We have federal laws like The Federal Trade Commission Act (FTC Act), the Health Insurance Portability and Accountability Act (HIPAA), and the Children’s Online Privacy Protection Act (COPPA), which are aimed at specific sectors, and we also have state statutes that are aimed at the rights of individual consumers. However, there is no single principal data protection legislation, which means the currently enacted laws don’t always work together cohesively.

And this adds to one big, confusing jigsaw puzzle with pieces that sometimes overlap and contradict one another.  

Up until now, the timeline of such regulations has been slow and piecemeal. Most states are weak in terms of data protection, with a few, Florida and Massachusetts, for example, serving as “leaders” in data privacy regulation.

Already this year we’ve seen the EU’s General Data Protection Regulation (GDPR) going into effect, and we’ve also seen (way too many) data breaches in the states. The issue of data privacy is gaining notice throughout our nation and throughout the rest of the world, and now some of us are wondering: what does the future hold in terms of data privacy in the U.S.?

California’s sweeping law seems to be a good step in the right direction, but how does it fit into the rest of the puzzle?

An “Interesting” Piece, To Say The Least

California’s new privacy law will give consumers more control over their data and force data-holding companies to become more accountable and transparent. The Act establishes the right of California residents to know what personal information about them is being collected and to whom it is being sold, plus the ability to access and delete that information. Additionally, the Act establishes opt-in consent requirements for individuals under the age of 16.

It’s coming into effect in the wake of the new EU law that was enforced in May, and although it isn’t as extensive as the GDPR, it’s certainly proving to be a forerunner of U.S. privacy rights. 

However, the Act also took an interesting path: surprisingly, it didn’t face much opposition from major companies despite its fleshed-out regulations.

Why not?

Because there was also a ballot measure—the California Consumer Personal Information Disclosure and Sale Initiative—that had been cleared for a vote in California in the fall, which would have proved to be an even greater challenge for companies due to its tighter restrictions and higher fines.

Major companies—like Facebook, Verizon, Uber, and Google, among others—were already lining up against the ballot, and some donated to the Committee to Protect California Jobs in a further effort to oppose it.

Leaders of the Committee to Protect California Jobs said in a statement, “This ballot measure disconnects California. It is unworkable, requiring the Internet and businesses in California to operate differently than the rest of the world…”

In the end, even though enough signatures were collected for the initiative to appear on the ballot, a compromise was reached instead. This resulted in the proponents withdrawing the initiative and the newly approved Consumer Privacy Act entering the world.

So, to sum up the story, the end result basically came about from many of the voters having to choose between “I don’t like this” or “I really don’t like this.”

…Which kind of sounds like the debate you’d have while shopping for the top two hardest bingo games at the store because it’s your great aunt’s birthday and she wants to party.

The “Puzzle” Thus Far: A Quick Data Privacy Timeline

The California Consumer Privacy Act arrives as a new and shiny addition to a slow and dusty timeline of U.S. privacy regulations.

Let’s take a quick peek at a timeline of some of our nation’s data protection laws:

1974 – Family Educational Rights and Privacy Act: restricts disclosure of educational records

1978 – The Right to Financial Privacy Act: restricts disclosure to the government of financial records of banks and financial institutions

1986 – Computer Fraud and Abuse Act: prohibits unauthorized computer access that obtains financial information or something of value, causes damage, or affects medical records

1986 – Electronic Communications Privacy Act: protects electronic communications during production, transit, and storage, and applies to email, telephone conversations, and data stored electronically

1988 – Video Privacy Protection Act: prohibits videotape sale and rental companies from disclosing data

1994 – Driver’s Privacy Protection Act: restricts states from disclosing state drivers’ license and motor vehicle records

2000 – The Children’s Online Privacy Protection Act: restricts collection of data from children under the age of 13

2003 – Health Insurance Portability and Accountability Act: protects and establishes standards for the electronic exchange and security of health information

Because the U.S. takes a sectoral approach to regulating privacy, many of the current regulations overlap in some areas while providing gaps in other areas.

For example, the Family Educational Rights and Privacy Act (FERPA) generally covers data like student immunization and medical records, but it sometimes conflicts with COPPA, which only protects data for children under the age of 13.

With ever-growing sources of sensitive and valuable data, and the increasing risk of that data being mishandled and exposed, the need for solid privacy regulations is greater than ever.

But with a sectoral approach to regulations, the result is that maintaining standards of data privacy becomes a confusing and complicated task.

The Big Picture (Hopefully Not Of Cats)

There was a time when many U.S. organizations deemed the sectoral approach preferable to a more overarching approach like the GDPR: industries could establish a more “individualized” style of regulation suited to their needs, and the hodgepodge of rules sometimes created gaps that organizations could slip through.

Now, however, the gaps are smaller, and the overlaps that have replaced them make it significantly more difficult and complicated for organizations to handle their data appropriately. The U.S. is still an outlier in its privacy approach, and it’s starting to get a bad rap across the globe.

The new California Consumer Privacy Act of 2018 is one more piece in the immense jigsaw puzzle that makes up the U.S.’s approach to privacy law, but it raises important questions: how well will it fit with existing regulations, and how much influence will it have on future ones?

Ideally, the nation’s future of data privacy laws will be cohesive, clean, and fit together well in a way that thoroughly protects citizens’ data and is adaptable to numerous industries.

California has taken a big step toward the future of data privacy. Here’s hoping that only good things follow.

Filed Under: Cybersecurity Tagged With: act, california, california consumer privacy act, data mining, data privacy, law, legislation, Privacy, Security, statute
