AXEL.org
Cybersecurity

August 23, 2018

Read This Before Downloading That New App

Last year, the total number of mobile app downloads worldwide was calculated to be 178.1 billion.

And that number is only expected to go up this year, as more and more apps continue to show up on the market and draw our attention.

In fact, with over 5.8 million apps available to download today, you’ve probably had a lot of conversations about that amazing thing you can do on your phone because of a new app.

But have you discussed the safety of those apps you’ve been downloading, and whether or not the data on your phone is still secure?

“Using Apps Safely” might sound like a boring topic—I mean, come on, who cares about that when they’re busy taking a quiz to find out which Disney princess they are—but it’s extremely important for every user to be aware of and informed about the potential dangers of some of the apps on today’s market.

Every new app should pass certain criteria before being downloaded. And there is a huge reason why.

Apps Cultivate Data

App safety isn’t exactly a new discussion topic, but it’s one that isn’t always taken seriously. Today’s apps are new and exciting and full of promises. You can do practically anything with one—from important things like locking your front door…

…to really important things like proving you’re a true Game of Thrones fan with a Hodor keyboard (really).

But every app you use cultivates more data.

What’s more, mobile marketing is making a bigger appearance because businesses are fully aware of the monetary potential that apps now carry. And this means that the data we cultivate while using our various apps is becoming more and more desirable.

How much data do we cultivate while using apps?

Think about it: We live with our phones connected to our hands; we communicate with friends and coworkers, we answer emails, we track our health, we calculate our caloric intake, we shop for clothes, we keep tabs on our bank accounts… we even let our devices memorize our faces.

Just last year, Statista calculated that app users spent 77% of their valuable time on their three most-frequented apps.

That’s a lot of time spent on apps, and a lot of data created while using them. For marketers, it means a gold mine of monetary potential.

Using Apps = Making Money

As we open our various apps, make in-app purchases, and tap on one link after another, some companies are tracking our behavior because it gives them a better picture of who we are and what motivates us to click “buy.”

This is why we have to pay attention to the integrity of every app we download. Some companies are sneaky about what data they collect and how they handle it. It’s valuable stuff, and there’s a lot of it, so they’ve figured out an easy way to get what they want without you catching on—through their apps.

And that, of course, means our data privacy concerns need to extend into the world of apps.

So if we know the potential danger of downloading an untrustworthy app, then why are so many everyday users careless about which ones they download?

I mean, you wouldn’t let just anybody into your house to rifle through your closet, read your mail, browse through your personal journals, and then use that information to make money, right? So why would you allow an app to essentially do the same thing to the data on your phone?

The answer to that is this: the ease and excitement of downloading a new app far outweigh any potential threats that the app might pose.

Because of this, many of us tap the download button without giving a second thought to the app’s safety and then suffer the consequences of having downloaded a “Trojan” app—one that hides a brutal invasion.

Suddenly, we go from operating our phone to holding a data-laden device in our hands that’s being operated by hackers.

But here’s the good news: you can learn to spot a potentially malicious app before it harms you.

And you can feel more confident about the safety of your apps by checking certain things before tapping that download button. It’s easy to enjoy the benefits of those amazing apps out there if we just learn how to tell whether an app is safe or not.

So, before you download anything new, make sure to run that app past a few safety checkpoints to ensure that it upholds data safety practices.

4 Checkpoints An App Should Pass Before Downloading

Imagine it’s a Sunday afternoon, it’s raining, everyone you know is too busy for you, and even your dog doesn’t want to look at you. You’re bored—and you want to download that cool new app and figure out all the incredible stuff it does.

If you’re bored out of your mind, you might be tempted to throw caution to the wind and hit “download” without a second thought.

But before you do that, remember that you like your data better when it’s not being exploited—so take a few minutes and double-check to see if that new app can pass these 4 safety checkpoints.

Checkpoint One: The Integrity of the App’s Marketplace

Where is that app coming from? The best route to take when downloading an app is to start from a reputable market source. Read through the marketplace’s privacy policies and check whether it holds its developers accountable to strict guidelines (for example, here are Apple’s developer guidelines and Google’s policy for developers). Reputable marketplaces will have strict privacy policies and guidelines and a history of expelling violators.

Checkpoint Two: The Reviews

Read the reviews. Are the ratings high, or at least reasonable? Did any reviewers mention that they downloaded the app and were invaded by malware? Or, does every single review seem positive and fake? Some app developers will hire people to leave fake reviews in order to boost their ratings. Take some time to read through a good mix of the app’s reviews and evaluate whether it seems safe or not.

Checkpoint Three: The Company

Does the company that created the app seem safe and reputable, or does it seem questionable? Go to the company’s website and read about their history, maybe find out about their team, and see if they are a legitimate company and not some clueless app tinkerer trying to throw bad apps into the mix. Trustworthy companies aren’t going to risk their business by putting out a nasty app.

Checkpoint Four: The Privacy Policy

Before ever allowing an app to take up space on your device, take the time (I know it doesn’t sound fun, but trust me, it’s worth it) to read the company’s privacy policy in order to learn exactly WHAT information they plan on acquiring and HOW they plan to use that information.

A lot of untrustworthy apps have questionable policies that fly under the radar because most people don’t want to take the time to wade through the technical lingo. Don’t let this tactic get you—read through the policy and find out whether the app will be accessing your data and selling it to third parties or using it in other ways for monetary gain.

Essentially, any new app you’re checking out should come packaged with a privacy policy that is clear and honest about its intentions—one you can trust your data with.

(In fact, if you want to see an example of a solid policy right now, check out the AXEL privacy policy. We’re kind of proud of it.)

Happy App-ing

There are plenty of bad apps out there that you will want to avoid, but there are also plenty of really awesome apps out there that might actually transform the way you do things in the best possible way.

It’s up to you to be aware of the benefits and dangers of today’s apps and to assess whether the one you’re about to download will protect your private data or put it at risk.

And remember: although there are some app developers out there who want to hack your data with their invasive app, there are also a large number of trustworthy developers out there who know how to combine innovative tools with strong privacy protection.

So don’t worry—you can have fun and do amazing things on your phone while also protecting your data.

Filed Under: Cybersecurity Tagged With: app, apps, cybersecurity, data privacy, information security, online privacy, online security, Privacy, safety, Security

August 21, 2018

The Hidden Danger of Virtual Worlds

On a summer afternoon, a number of Microsoft employees were invited to attend a training seminar.

But, instead of grabbing a pen and heading to the boardroom, they plugged themselves into a set of headphones and fired up Second Life.

This online social “game” was huge for a number of years in the early 00s, mainly because it offered average, everyday citizens an escape from the monotony of real life. Through a digitized landscape, users could create new “lives” that were as hedonistic as they chose.

For Microsoft employees, the pixelated replica of the Microsoft building was the location of their training seminar. But it wasn’t just Microsoft that jumped on the bandwagon – big-name rock stars lined up to perform virtual gigs and real-life travel companies sent correspondents into the melee to report on the latest developments.

For all intents and purposes, Second Life was real life – except you could enjoy it from the comfort of your own home.

The “game” (a term which should be used loosely in this context because, well, there’s actually no way to win at Second Life) was inspired by Snow Crash, the 1992 novel by Neal Stephenson. In the book, citizens navigate around a digital world created and run by independent entrepreneurs – a concept that’s becoming more and more real by the day.

The purpose of Second Life isn’t to gather as many gold coins as possible or figure out a mission set by a wiry old wizard. Instead, it is simply a digital escapist fantasy that allows users to be whoever they want and do whatever they want away from the restrictions of the real world.

While the possibilities were (and still are) endless in Second Life, one phenomenon was quick to surface: normal people immersing themselves in the game were acting pretty much the same as they would in real life. This made it a fascinating environment for studying people’s social behaviors on a pre-built stage.

Sure, stories emerged of people having affairs on Second Life that affected real-world marriages but, for the most part, people used it to escape reality and… do pretty much the same as they were doing in their real lives.

What is the Metaverse?

Let’s backtrack for a minute.

The Metaverse is a term that dates back to Stephenson’s sci-fi novel. It was the name given to the virtual world in which the characters interacted and lived, and it’s now the term being given to a blockchain project that essentially aims to replicate the real world in a digitized format.

In Snow Crash, “players” moved around as Avatars while the central strip – known as “the Street” – could be built on by developers, creating an even more entangled version of reality.

The goal of the Metaverse project is to build an entire universe where digital assets and digital identities are the basis of transactions to create a new kind of ecosystem that has the potential to completely change human society.

Even back in 1992, Stephenson had an insightful eye into what the future might hold for humanity. Today, our lives resemble those of the characters in the book – our work and lives are becoming more and more digitized, with people spending more time online than offline.

The way we communicate has undergone a complete transformation: we now send clipped messages via the internet rather than having to face talking to real people. Soon, we might see even more transfers – both human- and asset-based – taking place on the blockchain, which will shift the entire economic world.

It can be a hard pill to swallow, but some might argue we’re already halfway there. Enter the New Reality.

With people increasingly living their lives out online, there’s one big issue that keeps bubbling away below the surface – data privacy.

The Metaverse and its Effect on Data Privacy

In the real world, we don’t have to enter a username and a password to wake up in the morning and, when we pass people on the street, our full names and addresses aren’t typed out in a bubble above our heads.

Online it’s a different story. And, in fact, with the likes of Second Life and social platforms like Twitter and Facebook, users seem to be actively willing to hand over their information to access their feeds.

This raises the question of whether privacy will soon be regarded as an outdated social need or whether it will evolve into something else entirely. At the moment, the rules of the online world are considerably more open and vague than those in the real world, but this might have to change when the Metaverse comes into play.

Why?

Because so far, most virtual reality games and landscapes are built in a “walled garden” format. They run behind corporate firewalls and aren’t interconnected in any way. When you enter one world, you’re essentially caged in and avatars can’t travel between two different digital worlds. In this case, security isn’t necessarily a priority, because data isn’t being transferred from the hands of one corporation to another.

The problem arises when virtual worlds are built on open source software. This means avatars can travel between different virtual landscapes. And, for now, the majority of these platforms are built by developers in their spare time, which means that security is a low priority for them.

Take OpenSimulator, as an example. This software powers over 300 different public worlds and even more private ones, covering an area of 15,000 square kilometers. The software means anyone can set up a virtual world via the Oculus Rift without having to break the bank.

MOSES, one of the worlds built with OpenSimulator, is owned by the US Army, and reports of its security problems are already doing the rounds. At the moment, it’s difficult to know how to go about addressing data security issues when this new digital landscape is so new (despite its fictional origin in the 90s).

For now, it seems, the Metaverse is an experimental place to dabble in the future of humanity. The fresh excitement of it and the relatively unknown future it holds means security isn’t necessarily a priority for developers.

But soon, when more and more people start venturing into their online lives, we’ll have to sit down and seriously think about what data privacy means in this new landscape, particularly when it comes to things like authentication, content protection, and secure communications.

But, if Second Life is anything to go by, the population of people who are ready and willing to escape reality and immerse themselves in an online parallel universe is more concerned with who they will be there than with who will take their information.

Filed Under: Cybersecurity Tagged With: avatar, cybersecurity, data, data protection, metaverse, Privacy, second life, virtual

August 14, 2018

6 textbook examples of how NOT to respond to a Data Breach (Seriously guys?)

Yahoo: Do nothing and pray it goes away

Why are we surprised at this?! When Yahoo suffered a breach in 2013, it decided to just keep quiet about the 3 billion accounts that were compromised. Surely this would prove to be an effective strategy?

LOL.

The news finally broke a whole FOUR years later, in 2017: all 3 billion accounts had been hacked – far more than the company had admitted in 2016, when anyone first heard anything about a data breach at all. We shouldn’t really be surprised, as “do nothing and pray it goes away” has been Yahoo’s MO for quite some time now.

FriendFinder Networks: Take days to respond and then downplay the incident in a vague press release

FriendFinder Networks is exactly the kind of company you’d reeeally want keeping your data secure. It operates AdultFriendFinder, a “sex and swinger community,” and when it suffered a breach in 2016, the response was slow and the press release was tepid. The company affirmed that it “encourages users to change their passwords,” and appeared to put most of the onus on the users, commenting that it would contact users “to provide them with information and guidance on how they can protect themselves.” Seriously?

This press release came after days of speculation, which is actually forever if you are a user of an adult website waiting to find out if your data has been made public.

Equifax: Fail to patch software, take forever to disclose breach, let execs sell their shares

Equifax has one of the shadiest timelines of this group, and the competition was stiff!! After the company failed to patch a known vulnerability in the widely used open-source software Apache Struts in March 2017, the data of 143 million US customers was potentially exposed starting in May 2017. The breach was discovered on July 29th, and just days later executives sold off nearly $1.8M worth of Equifax shares. Hmm….this looks bad, but maybe there’s something we don’t know here. (Read: there’s not. It’s bad.)

Ticketmaster: Pretend it’s not happening

Ticketmaster was alerted to a possible breach in April of 2018, but decided to do its best impression of an ostrich and just pretend it wasn’t happening until it received apparently irrefutable (or un-buryable) evidence on June 23rd. Online bank Monzo released a statement shortly afterward saying it spotted the breach in April, but Ticketmaster said nah after an internal investigation revealed no evidence of any such breach.

I’m confused. Are we just letting companies investigate themselves now? This is not how any of this should work. Anywho….

Facebook: Deny deny deny

Facebook didn’t suffer a breach. Instead, it voluntarily gave away a treasure trove of user data and then informed us that we had all agreed to it in the terms and conditions. Whoops – we should have read those, but they’re just so boring, and no one can recall seeing a line item that said “we will give away all your data, suckers, and there’s nothing you can do about it LOL.” I think I would have remembered that…..

To its credit, Facebook did admit that user data had been “improperly shared,” but didn’t go so far as to call it a breach. They didn’t go so far as to call us suckers either, but that doesn’t mean it isn’t true.

Exactis: Leave us all in suspense as if our data’s safety was a plot point in a Mission Impossible movie

None of this is entertaining, you guys. Apparently there is a “database with pretty much every US citizen in it” floating around the internet, according to security experts. That seems pretty bad.

But even worse, the company associated with the breach has stayed silent for days, which is deeply bumming out 230 million of us who would kindly like to know if our personal information is available online.

The bottom line

Data breaches are inevitable. Attackers are targeting companies on a daily basis. But ignoring the fact that a data breach has occurred, failing to patch a known vulnerability, putting the onus of dealing with a breach on users, and – most obviously of all – selling off your stock when you have insider information of a breach doesn’t help anyone. Companies need to be honest when they think a breach has occurred, or they risk losing their customers’ trust. And as our data multiplies exponentially, trust is becoming scarce.

Filed Under: Cybersecurity Tagged With: Breach, cybersecurity, data breach, equifax, facebook, online privacy, Privacy, Security, ticketmaster

July 26, 2018

Why Your Camera Isn’t As Safe As You Think

You’ve joked about it before. How some lonely CIA agent is secretly watching (and perhaps salivating at) your every move via your webcam. So, after you get out of the shower you open your laptop, strike a pose, and chuckle to yourself because you know the very idea is both hilarious and preposterous…until you realize it isn’t.

Webcam spying is very real.

Sure, you’ve seen articles and news segments about people who’ve fallen victim to spying via their webcams. But that’s because they’re either incredibly careless or doing some illegal stuff they know they shouldn’t. Right? Not exactly.

It’s well within the realm of possibility that you’ll wake up tomorrow morning to see pictures and/or videos of yourself in some sort of “compromising position” online. Yes, I said pictures of you. Plain old, beer-drinking, hangover-having you. But of course, you probably won’t see those pics until they’ve been liked, retweeted, and shared with a million other people first.

So, in case you missed it, my friend, welcome to the 21st century.

Devices That May Be Hacked

In most cases, spying is done through the cameras of desktop or laptop computers. So if you’re thinking of taking that extra five bucks out of your dad’s wallet, don’t assume you’re all alone. A hacked camera can cause severe emotional or psychological damage. One 20-year-old Glasgow student was left traumatized after she found out webcam hackers watched her while she was in the bath.

Although computer webcams are the devices that are most commonly hacked, you can also be tracked and watched via your smartphone camera. Even surveillance systems may be hacked and used to track people in real time. This means unscrupulous individuals may be able to watch you at home or at work from multiple angles, all day, every day.

Even more frightening is the idea that your children may be targeted. Imagine the horror of a mother in Houston who found out that footage of her daughters’ bedroom was being live-streamed. And if you thought things couldn’t get any weirder than that, consider the fact that even baby monitors are being used for spying and the data of more than 2.5 million kids was stolen using their favorite smart toy.

How It’s Done

The most common way hackers access your cameras is by using malware. Seemingly innocent links or attachments embedded in emails and online ads may be riddled with Trojan horses. Be sure to avoid the sweet Russian girls; a simple click or download could leave your device infected—effectively handing over control on a silver platter. Oftentimes, the malicious code is packaged with legitimate programs or software so you don’t even notice it. Hackers with remote access can turn your cameras on and off with no indication from the camera light.

Another way people may gain unpermitted access is by borrowing your device and manually downloading applications that allow them to access your files, camera, and microphone. These applications can be hidden so you don’t even know they are at work.

And if you thought that was all, my friend, you’d be wrong. Ever thought about app permissions? I’m sure you’re familiar with apps asking for permission to use your camera. What you may not be familiar with is the fact that these apps can capture you on camera at any time when they’re in the foreground (yes, that means even when you’re not using the camera). What’s worse is that no one knows what these apps may be able to access when they’re in the background and out of sight.

Unmasking The Creeps Who Spy On You

The main perpetrators in the spying pandemic are hackers. They use Trojans to claim control of your cameras and watch your every move. They may put your photos and videos on the internet for others to view online. In more disturbing cases, nude and intimate moments may be live-streamed on voyeuristic websites.

As I mentioned before, apps can also gain unpermitted access to both your front and rear cameras. Who knows what WhatsApp, Instagram, Snapchat and the like are capturing when your cameras are off and what they’re doing with it? Are they selling footage of you? Maybe. To whom or for what purpose I don’t know, but Snapchat may need the money.

Now, we’ll discuss the attackers you already know about: the government. Did you know that built-in backdoors in your smartphone may allow the government to access your files, read your messages, listen to or record your calls, capture images, and stream video? Just in case you ever thought the government was on your side!

In 2013, Edward Snowden revealed that GCHQ—a British surveillance agency—collected and stored images from the video chats of millions of Yahoo users under the Optic Nerve program. Yes, tons of raunchy pics were collected and stored as well for…uh…security reasons.

But the blatant disregard for your online privacy doesn’t stop there. In fact, your school and the people you know best may be the biggest culprits.

Between 2009 and 2010, a number of Pennsylvania schools were caught remotely accessing the cameras of laptops issued to their students. And as for your “friends,” they can simply install spy software on your device without you having the slightest clue. Just ask pageant girl Cassidy Wolf. She learned the hard way when she was blackmailed with nude photographs her former classmate had taken via her webcam.

Why Cyber Spying Is Wrong

This one is obvious. We all have the right to data privacy. What’s that, you ask? It’s the ability for an individual (or organization) to determine if and how personal data will be shared with third parties. This includes access to the cameras on your laptop, smartphone, and surveillance system. Your data, your choice.

But is it even your data?

That’s a pretty important question. The terms and conditions you’re so quick to agree to (but never really read) may disagree. Are you unknowingly giving apps permission to access your cameras even when you aren’t using them? Maybe. Is this approach grossly unethical and utterly misleading? Yes. Is it illegal? Perhaps not.

What You Can Do About It

If you’re fine with people spying on you, you might as well stop reading right now. If you’d like a few tips on how to deal with the issue, consider those listed here:

  1. Cover your webcam with tape. If you have an external webcam, be sure to unplug it when you’re not using it.
  2. Install anti-virus software on your PC and your smartphone. It will readily spot and block malware. Be sure to keep your firewall enabled as well.
  3. Use protection. No, not that type of protection. Place a secure lock on your phone. Use a fingerprint lock or password to keep nosy “friends” away.
  4. Use your devices on secure networks. Stay away from public networks.
  5. Think carefully before giving an app permission to access your camera.
  6. Update the password for your surveillance system regularly.
  7. Be cautious about the emails that you open and the links or attachments inside them.
  8. Be wary of online advertisements and dodgy chat rooms.

And because I love you, here are some other ways you can be safe online.

The Bottom Line

People are definitely being spied on with their own cameras. You may be one of them. The government, hackers, schools, apps, and people you know may not be as innocent as they seem. While organizations like ours try to bring light to this gross disregard for your right to data privacy, remember to do what you can to keep yourself protected. And for Pete’s sake, never, EVER, trust that shady middle-aged guy who always sits behind you in the coffee shop!

Filed Under: Cybersecurity Tagged With: camera, cybersecurity, data privacy, online privacy, Privacy, Security, spying, webcam

July 19, 2018

Protect Data Privacy by NOT Collecting Data at All

In Hansel and Gretel, the two siblings sprinkle breadcrumbs as they venture into the woods in order to find their way home.

When we browse the internet, we sprinkle metaphorical breadcrumbs of information about ourselves as we go. Unlike the fairytale, where Hansel and Gretel knew what they were doing, the vast majority of internet users are unaware of just how much information they’re giving away on their journey around the web.

Unless you’ve got blockers installed up to your ears, the tracking starts as soon as you open up an internet browser. From that moment, your digital footprints carve a route around the web that can be traced back to you at any moment.

Sites you visit can use these footprints (or breadcrumbs, if we’re sticking with the fairytale theme) to recognize who you are and serve you a more personalized experience.

That sounds great, right?

In one study, 71% of consumers said they’d prefer a personalized experience with ads, while some even expected it from brands. And the easiest way for sites to personalize those experiences is to track the interests and online behaviors of visitors.

From that perspective it works; the consumer gets a personalized experience and brands get to give their customers what they want. It’s a win-win situation.

But is it really that simple?

I mean, we’re not talking epic government data mining expeditions here; we’re simply talking about brands using specific information to better target content to their users. It’s all above board and totally legal.

So what kind of data can these companies get from you?

It can be anything from your current location and the device you’re using to specific links you’re clicking on and the actions you take on certain sites. It all starts with your browser and your IP address – the moment you pop up online, the unique number identifying your internet connection is recorded, marking the moment you entered the internet and where you were when you went online.

At the same time, your browser is logged as well as other uniquely identifying information like the system you’re running the browser on, the display resolution, and even the battery level of your device. Even if you haven’t clicked your mouse or typed anything in yet, you’re already being tracked.
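As a rough sketch of what that first moment of tracking looks like: the field names below mirror the real browser navigator and screen APIs, but the objects are passed in as plain arguments (a simplification for illustration) so the logic can run and be inspected outside a browser.

```typescript
// Sketch (not any site's actual code) of the passive record a page can
// assemble the moment it loads. Field names mirror the browser's
// navigator and screen APIs; the objects are injected as arguments.
interface NavLike { userAgent: string; language: string; platform: string; }
interface ScreenLike { width: number; height: number; }

function collectFingerprint(nav: NavLike, scr: ScreenLike) {
  return {
    userAgent: nav.userAgent,                 // browser + operating system
    language: nav.language,                   // preferred language
    platform: nav.platform,                   // OS family
    resolution: `${scr.width}x${scr.height}`, // display resolution
  };
}

// In a real page this would be collectFingerprint(navigator, screen);
// here the objects are stubbed to show the shape of what gets recorded.
const fp = collectFingerprint(
  { userAgent: "Mozilla/5.0 (X11; Linux x86_64)", language: "en-US", platform: "Linux x86_64" },
  { width: 1920, height: 1080 }
);
console.log(fp.resolution); // "1920x1080"
```

Real trackers fold many more signals into such a record (time zone, installed fonts, canvas rendering, battery level); the point is that none of them require a single click from you.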

Who Benefits from Collecting Data?

I mentioned earlier that data collection can be mutually beneficial. Consumers don’t have to see ads that they’d never buy from in a million years, while websites can get more information on their visitors to make experiences more personalized and, therefore, get more sales.

But who is it really more beneficial for? If we really get down to the bottom of it, who is really getting the most out of the dissemination of data?

Personalized experiences are nice, right? But are they worth the data breaches that happen and the inevitability that brands will sell that data to completely unrelated companies just to make a quick buck?

Let’s face it: most sites are eager to scrape as much information as they can about their visitors with the sole purpose of making more money. Sure, the thought process might be there to make experiences more enjoyable by personalizing them, but really the goal here is to target more.

Look at Facebook. The data it collects as you browse the site can determine when you’re expecting your firstborn, the exact names and addresses of the companies you’ve worked for in the past, and even your political leaning.

And guess what?

It doesn’t just collect this data to get to know you better as if you’re on some kind of weird, digital first date. It collects it to sell to companies to make money through advertising.

So yes, there are benefits to the consumer; you might not have to pick a particular city every time you want to get the weather because it’s remembered your past choices, or you might not have to shop again for those items you left in your online basket last week, but these benefits are minor compared to the massive benefits companies and sites get from tracking your every move.

Where the Lines Get Hazy…

Of course, there are browser security protocols in place that mean sites can’t just go around scraping all sorts of stuff about you. In fact, for the most part, sites can only access the data they’ve collected themselves – as in, they can only see the information you’ve “given” them while you’ve been on their site.

However, something called third-party cookies muddies the waters. These cookies aren’t set by the site you’re actually visiting, but by a third party – say, an ad network – whose content is embedded across a number of different sites.

Princeton University ran a study that found cross-site trackers embedded in 482 of the top 50,000 sites on the web. It might not seem like a lot in the grand scheme of things, but once these third-party trackers have consumer information, they can then sell it on to even more people.
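The distinction that makes this possible can be sketched in a few lines: a cookie counts as "third-party" when the domain that set it differs from the site in your address bar. The helper below is illustrative only, and uses a naive "last two labels" heuristic to decide what counts as the same site:

```javascript
// Illustrative sketch: a cookie is "third-party" when the domain that set it
// differs from the site shown in the address bar. An ad network's script
// embedded on many sites sets cookies under its own domain, so the same
// identifier follows you from site to site.
function isThirdPartyCookie(pageDomain, cookieDomain) {
  // Naive heuristic: treat "ads.tracker.example" as part of "tracker.example".
  // (Real browsers use the Public Suffix List for this.)
  const site = (d) => d.split(".").slice(-2).join(".");
  return site(pageDomain) !== site(cookieDomain);
}

// A cookie from an embedded ad network is third-party on a news site:
//   isThirdPartyCookie("news.example", "ads.tracker.example") → true
// A cookie from the site's own subdomain is not:
//   isThirdPartyCookie("shop.example.com", "example.com") → false
```

Because the ad network's domain stays the same across every page that embeds it, the same cookie, and the same visitor profile, travels with you from site to site.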

While the most sensitive data is supposedly redacted by these trackers, consumers still have to put their trust in a nameless, faceless brand.

But what about the data that consumers are handing over willingly?

Things like Google searches and checking into venues on Facebook?

While sites might be collecting information like which browser you’re using and what your shopping preferences are, you’ve probably handed over more sensitive information like your birth date and exact location without even giving it a second thought.

Does the Future Lie in NO Data Collection?

In May this year, the GDPR (General Data Protection Regulation) came into play in Europe. It means that brands now have to explicitly state to their users exactly what information they are collecting and exactly what they will be doing with it.

Users now have to actively opt-in to providing their information; sites can’t just take it for nothing. Already countries outside of Europe are considering this new method because, well, it just seems like the right thing to do.
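In code terms, the opt-in model flips the default from "collect unless refused" to "collect nothing unless granted." Here is a minimal sketch of that gate; the function and field names are hypothetical, not from the GDPR text or any particular consent library:

```javascript
// Illustrative sketch of the GDPR-style opt-in model: no piece of personal
// data is collected unless the user has explicitly consented to that
// specific use first. All names here are hypothetical.
function collectIfConsented(user, field) {
  // Default is "no": missing or false consent means nothing is collected.
  if (!user.consents || user.consents[field] !== true) {
    return null;
  }
  return user.profile[field];
}

// With an explicit opt-in, the value is released; without one, it never is:
//   collectIfConsented({ consents: { email: true },
//                        profile: { email: "a@example.com" } }, "email")
//     → "a@example.com"
//   collectIfConsented({ profile: { email: "a@example.com" } }, "email")
//     → null
```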

But what does it mean for the future of data collection?

Now that users are more aware of their rights when it comes to data collection and have to actively “opt-in” with their information, they are becoming less and less inclined to do so.

If there’s an option to not sell your firstborn, it’s kind of a given that you’re going to go for that, right?

In this instance, the future of data collection looks bleak – especially for sites and brands. If their users aren’t giving up the goods, they’ve got nothing to work with and essentially have to go back to the drawing board.

This might invite new ways of collecting data or a more collaborative approach between consumers and brands so that information can travel between the two in an open and honest way.

The future of data privacy is uncertain for now, especially so soon after GDPR has reared its head. What seems likely is that power will be distributed more evenly between internet users and brands, and sites will no longer be able to take, take, take without building more of a relationship with their visitors.

It sounds quite nice, actually.

But would a world without any data tracking or collection be good? If every person who went online immediately went incognito, leaving not a single trace of who they are or what they’re doing, how would the digital world evolve? How would companies know what their consumers want? How would internet users cope with having to start from scratch every time they went back online?

The questions remain endless, but it’ll be interesting to see which path data collection goes down from here on out.

Filed Under: Cybersecurity Tagged With: cookies, cybersecurity, data privacy, data protection, infosec, Privacy, Security, tracking

July 3, 2018

California Thinks It’s Fixing Data Privacy. It’s Not.

“Your move,” says the new California Consumer Privacy Act of 2018.

Except, this isn’t a game of chess—picture it more like a million-piece jigsaw puzzle called “Cats Around the World,” and it’s been spread out on your dining room table for the past twenty years and you’re only 40 pieces in.

(Sounds like a party, am I right?)

Here’s the thing: the data privacy law that was signed on Thursday by California’s Gov. Jerry Brown is a new piece of the data privacy jigsaw puzzle that has served as the U.S.’s means to protect its citizens’ privacy. It’s certainly a huge step in terms of improved privacy laws, but it’s not quite clear how it fits into the nation’s “big picture.”

So far, the U.S.’s privacy law game is patchwork and somewhat messy. We have federal laws like The Federal Trade Commission Act (FTC Act), the Health Insurance Portability and Accountability Act (HIPAA), and the Children’s Online Privacy Protection Act (COPPA), which are aimed at specific sectors, and we also have state statutes that are aimed at the rights of individual consumers. However, there is no single principal data protection legislation, which means the currently enacted laws don’t always work together cohesively.

And this adds up to one big, confusing jigsaw puzzle with pieces that sometimes overlap and contradict one another.

Up until now, the rollout of such regulations has been slow and piecemeal. Most of our states are weak in terms of their data protection, with a few states—Florida and Massachusetts, for example—serving as “leaders” in data privacy regulations.

Already this year we’ve seen the EU’s General Data Protection Regulation (GDPR) going into effect, and we’ve also seen (way too many) data breaches in the states. The issue of data privacy is gaining notice throughout our nation and throughout the rest of the world, and now some of us are wondering: what does the future hold in terms of data privacy in the U.S.?

California’s sweeping law seems to be a good step in the right direction, but how does it fit into the rest of the puzzle?

An “Interesting” Piece, To Say The Least

California’s new privacy law will give consumers more control over their data and force data-holding companies to become more accountable and transparent. The Act establishes the right of California residents to know what personal information about them is being collected and to whom it is being sold, plus the ability to access that information and have it deleted. Additionally, the Act establishes an opt-in consent requirement for the sale of personal information belonging to consumers under the age of 16.
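Those rights map fairly directly onto request types a data-holding company would have to handle. The sketch below models just that mapping; the data store shape and function names are hypothetical, chosen only to illustrate the Act's "know/access" and "delete" rights:

```javascript
// Illustrative sketch of handling the consumer rights the Act describes:
// knowing what personal information is collected and to whom it is sold,
// accessing that information, and deleting it. The store layout and names
// are hypothetical.
function handleConsumerRequest(store, consumerId, request) {
  const record = store[consumerId];
  if (!record) return null; // unknown consumer: nothing to disclose

  switch (request) {
    case "access": // right to know / access collected data and its buyers
      return { data: record.data, soldTo: record.soldTo };
    case "delete": // right to deletion
      delete store[consumerId];
      return { deleted: true };
    default:
      return null; // unrecognized request type
  }
}
```

A company would additionally need identity verification and statutory response deadlines around a handler like this, but the core obligations reduce to disclosure and deletion.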

It comes in the wake of the new EU law that took effect in May, and although it isn’t as extensive as the GDPR, it’s certainly proving to be a forerunner of U.S. privacy rights.

However, the Act also took an interesting path—surprisingly, it didn’t face much opposition from major companies despite its fleshed-out regulations.

Why not?

Because there was also a ballot measure—the California Consumer Personal Information Disclosure and Sale Initiative—that had been cleared for a vote in California in the fall, which would have proved to be an even greater challenge for companies due to its tighter restrictions and higher fines.

Major companies—like Facebook, Verizon, Uber, and Google, among others—were already lining up against the ballot, and some donated to the Committee to Protect California Jobs in a further effort to oppose it.

Leaders of the Committee to Protect California Jobs said in a statement, “This ballot measure disconnects California. It is unworkable, requiring the Internet and businesses in California to operate differently than the rest of the world…”

In the end, even though enough signatures were collected for the initiative to appear on the ballot, a compromise was reached instead. This resulted in the proponents withdrawing the initiative and the newly approved Consumer Privacy Act entering the world.

So, to sum up the story, the end result basically came about from many of the voters having to choose between “I don’t like this” or “I really don’t like this.”

…Which kind of sounds like the debate you’d have while shopping for the top two hardest bingo games at the store because it’s your great aunt’s birthday and she wants to party.

The “Puzzle” Thus Far: A Quick Data Privacy Timeline

The California Consumer Privacy Act arrives as a new and shiny addition to a slow and dusty timeline of U.S. privacy regulations.

Let’s take a quick peek at a timeline of some of our nation’s data protection laws:

1974 – Family Educational Rights and Privacy Act: restricts disclosure of educational records

1978 – The Right to Financial Privacy Act: restricts disclosure to the government of financial records of banks and financial institutions

1986 – Computer Fraud and Abuse Act: prohibits accessing a computer without authorization to obtain financial information or anything of value, cause damage, or affect medical records

1986 – Electronic Communications Privacy Act: protects electronic communications during production, transit, and storage, and applies to email, telephone conversations, and data stored electronically

1988 – Video Privacy Protection Act: prohibits videotape sale and rental companies from disclosing data

1994 – Driver’s Privacy Protection Act: restricts states from disclosing state drivers’ license and motor vehicle records

2000 – The Children’s Online Privacy Protection Act: restricts collection of data from children under the age of 13

2003 – Health Insurance Portability and Accountability Act: protects and establishes standards for the electronic exchange and security of health information

Because the U.S. takes a sectoral approach to regulating privacy, many of the current regulations overlap in some areas while providing gaps in other areas.

For example, the Family Educational Rights and Privacy Act (FERPA) generally covers data like student immunization and medical records, but it sometimes conflicts with COPPA, which only protects data for children under the age of 13.

With ever-growing sources of sensitive and valuable data, and the increasing risk of that data being mishandled and exposed, a need for solid privacy regulations is bigger than ever.

But with a sectoral approach to regulations, the result is that maintaining standards of data privacy becomes a confusing and complicated task.

The Big Picture (Hopefully Not Of Cats)

There was a time when the sectoral approach was deemed by many U.S. organizations to be preferable to a more overarching approach like the GDPR: industries could establish a more “individualized” way of regulation that suited their needs, and the hodgepodge of regulations sometimes left gaps that organizations could slip through.

However, those gaps have since narrowed, and the overlaps that replaced them make it significantly more difficult and complicated for organizations to handle their data appropriately. The U.S. is still an outlier in its privacy approach, but now it’s starting to get a really bad rap across the globe.

The new California Consumer Privacy Act of 2018 is one more piece to add to the immense jigsaw puzzle that makes up the U.S.’s approach to privacy laws, but it raises important questions: how well will it fit in with existing regulations, and how much influence will it have on future ones?

Ideally, the nation’s future of data privacy laws will be cohesive, clean, and fit together well in a way that thoroughly protects citizens’ data and is adaptable to numerous industries.

California has made a big step towards the future of data privacy—here’s to hoping that only good things will follow.

Filed Under: Cybersecurity Tagged With: act, california, california consumer privacy act, data mining, data privacy, law, legislation, Privacy, Security, statute
