IT Security Professional

California Judge Rules Feds Can't Force You to Unlock Your iPhone with Your Finger or Face


On January 14th, Forbes reported that a California judge had ruled that American cops can’t force people to unlock a mobile phone with their face or finger. The ruling goes further to protect people’s private lives from government searches than any previous ruling and is being hailed as a potentially landmark decision.

Previously, U.S. judges had ruled that police were allowed to force unlock devices like Apple’s iPhone with biometrics, such as fingerprints, faces or irises. That was despite the fact that the feds weren’t permitted to force a suspect to divulge a passcode. This new ruling suggests that all logins are equal.

The order came from the U.S. District Court for the Northern District of California in the denial of a search warrant for an unspecified property in Oakland. The warrant was filed as part of an investigation into a Facebook extortion crime, in which a victim was asked to pay up or have an “embarrassing” video of them publicly released. The cops had some suspects in mind and wanted to raid their property. In doing so, the feds also wanted to open up any phone on the premises via facial recognition, a fingerprint or an iris.

While the judge agreed that investigators had shown probable cause to search the property, they didn’t have the right to open all devices inside by forcing unlocks with biometric features. Magistrate Judge Kandis Westmore ruled the request was “overbroad” as it was “neither limited to a particular person nor a particular device.”

In a more significant part of the ruling, Judge Westmore said that the government did not have the right, even with a warrant, to force suspects to incriminate themselves by unlocking their devices with their biological features. Previously, courts had decided biometric features, unlike passcodes, were not “testimonial.” That was because a suspect would have to willingly and verbally give up a passcode, which is not the case with biometrics. A password was therefore deemed testimony, but body parts were not, and so not granted Fifth Amendment protections against self-incrimination.

Declaring that “technology is outpacing the law,” the judge wrote that fingerprints and face scans were not the same as “physical evidence” when considered in a context where those body features would be used to unlock a phone.

“If a person cannot be compelled to provide a passcode because it is a testimonial communication, a person cannot be compelled to provide one’s finger, thumb, iris, face, or other biometric feature to unlock that same device,” the judge wrote. “The undersigned finds that a biometric feature is analogous to the nonverbal, physiological responses elicited during a polygraph test, which are used to determine guilt or innocence, and are considered testimonial.”

There were other ways the government could get access to relevant data in the Facebook extortion case “that do not trample on the Fifth Amendment,” Westmore added. They could, for instance, ask Facebook to provide Messenger communications, she suggested. Facebook has been willing to hand over such messages in a significant number of previous cases.

Law finally catching up with tech?

Over recent years, the government has drawn criticism for its smartphone searches. In 2016, Forbes uncovered a search warrant not dissimilar to the one in California. Again in the Golden State, the feds wanted to go onto a premises and force unlock devices with fingerprints, regardless of what phones or who was inside.

Andrew Crocker, senior staff attorney at the digital rights nonprofit Electronic Frontier Foundation, said the latest California ruling went a step further than he’d seen other courts go. In particular, Westmore observed that alphanumeric passcodes and biometrics served the same purpose in unlocking phones.

“While that’s a fairly novel conclusion, it’s important that courts are beginning to look at these issues on their own terms,” Crocker told Forbes. “In its recent decisions, the Supreme Court has made clear that digital searches raise serious privacy concerns that did not exist in the age of physical searches—a full forensic search of a cellphone reveals far more than a patdown of a suspect’s pockets during an arrest for example.”

The magistrate judge decision could be overturned by a district court judge, as happened in Illinois in 2017 with a similar ruling. The best advice for anyone concerned about government overreach into their smartphones: Have a strong alphanumeric passcode that you won’t be compelled to disclose.

E-mail: snelson@senseient.com   Phone: 703-359-0700
Digital Forensics/Information Security/Information Technology
https://www.senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson


Insurer Marsh Launches Cyber Self-Assessment Tool


On January 2nd, global insurer Marsh announced the launch of what it calls its next-generation online cyber self-assessment tool. The tool incorporates the latest insights on cybersecurity best practices to provide clients a robust cybersecurity program diagnostic and serve as a single application for cyber insurance, streamlining the procurement process.

The enhanced cyber assessment uses information on organizational cybersecurity controls, technology, and people to identify strengths and flag areas of potential concern for underwriters, which Marsh helps clients proactively address prior to underwriting discussions. Marsh combines the assessment with company-specific threat analyses and loss quantification inputs to provide a comprehensive diagnostic of a client’s cyber risk profile to support data-driven cyber risk program decisions.

Housed on an easy-to-use, secure web platform, the self-assessment tool streamlines clients’ cyber insurance application process by:

  • Allowing for simultaneous contribution by multiple organizational stakeholders, easing client workflow, and eliminating the challenge of version control prevalent in the use of shared, offline documents.
  • Serving as a single submission to multiple cyber insurance insurers, eliminating the inefficiencies and resource burdens of multiple carrier applications.

“In today’s fast-evolving cyber risk landscape, firms want to be able to gain greater insight into their cybersecurity preparedness,” said Thomas Reagan, US Cyber Practice Leader, Marsh. “Marsh’s enhanced online cyber self-assessment provides clients a comprehensive view of their cybersecurity program maturity, coupled with a streamlined, easy-to-use cyber insurance application process.”

The online cyber self-assessment tool is currently available to Marsh US clients, with a global rollout planned for 2019.

More information about Marsh’s online cyber self-assessment tool can be found on Marsh’s cyber website.

I suspect this means you will be giving a ton of information to Marsh and incurring significant costs to remedy deficiencies in order to get cyberinsurance at a reasonable price. And yes, I expect this to be a trend among insurers, who need to assess risk before they can write a policy.


josephwebster (Denver, CO, USA): You had to know this was coming.

StunGod (Portland, Oregon, USA): This is a very good idea. When I worked in banking, I tried very hard to get our loan underwriters to consider an applicant's IT practices and security practices as part of the lending criteria. Somebody with crap security is more likely to be hacked and lose their ability to pay, and that should be taken into account.

Commercial lenders want to see that you're using an accountant for your taxes and a lawyer for your contracts, and that you're following standard accounting practices, zoning rules, etc. You've got to have adequate physical security to prevent your inventory from being stolen, but cybersecurity is just not something of interest for some reason.

I guess insurers are tired of writing checks to dumbfucks who let their kids play games on the same computer they run payroll...

Death and Digital Assets


Over the holidays, we lost a dear friend in the legal community and were once again reminded that we need to keep stressing the need for digital asset planning. Our friend knew he had cancer, but he believed he was going to beat the cancer - and he certainly never imagined a precipitous decline leaving his mind foggy and unable to speak or write.

As a result, there is a great deal of financial and other important information residing on his iPad. His grieving wife doesn't know whether he wrote any passwords down (she's looking) and she doesn't know his Apple ID. We hope that, between Virginia's digital asset law and our digital forensics capabilities, we may be able to help her.

But this whole nightmare could have been avoided by consulting with an estate lawyer who understands digital assets. Much of it also could have been avoided if our friend had used a password manager and given her the password to get into it.

The grief of losing a spouse is hard enough - adding all the worries stemming from our use of technology to manage our finances and many other aspects of our lives creates a terrible unintentional burden.

If you have not adequately addressed your digital assets, please make a New Year's resolution to do so - we are seeing way too many grieving clients who, in the midst of their distress, come to us to get information from digital devices that they don't know how to access.



Drought reveals a long-submerged Colorado town on floor of Blue Mesa Reservoir


GUNNISON COUNTY, Colo. — The cold is just the cold. Bob Robbins and Bill Sunderlin know it all too well.

Negative 2? That’s nothing, says Sunderlin, 78. “Child’s play,” says Robbins, 69.

Their feet crunch the snowy ground floor of the Blue Mesa Reservoir, the land they know as Iola, the town that was here before the dam’s construction starting in 1962. They were among ranching families whose homes would be submerged by the dark depths of Colorado’s largest body of water. Iola was sacrificed with the bigger Sapinero and smaller Cebolla.

The people are gone, most anyway — Robbins and Sunderlin don’t know of many others who stayed like them, settling in higher ground nearby. And the town is gone, though not entirely.

Christian Murdock, The Gazette via AP
This photo, attached to an old piece of wood in Bill Sunderlin’s shop, shows his family resort along the Gunnison River near Iola, Colo., before the valley was flooded in the 1960s to create the Blue Mesa Reservoir.

Old foundations, semblances of fence lines and corroded remains of farm life have returned to daylight. As a devastating drought dragged on this summer, as the Blue Mesa dropped to this century’s lowest levels, Iola re-emerged, enthralling Western Slope dwellers who didn’t know of its sunken existence.

As the stark reality of a drier climate settles in, they find a refreshing distraction in Iola. But Robbins and Sunderlin, they only find harsh reminders.

“Well, we’ve had our ups and downs over the years,” says Sunderlin, his voice low and grainy over the chilly gust.

The downs come when the water is low enough to see what they lost. The memories come flowing back, as they did for Robbins’ mother in 2012, he recalls, back during the last driest year before this one, back before she died. “She just bawled like a baby,” he says.

It’s all coming back now on this visit.

The wavy fields of hay. The shimmering stream beneath the cottonwoods, and the fish the size of monsters in a little boy’s eyes. The neighing of horses and lowing of cattle, and the endless sky that blushed above the hills.

How green everything was. How they hid around the rock up there, where the teacher couldn’t find them. The song they sang: “We’re in our places with sun-shining faces …”

And, oh yes, the cold. The blistering cold. Those cold, long days of work and those nights by the fire.

That was cold. Not today. It’s gotten warmer all right, they say.

The temperature has nothing to do with the pain they’re feeling now.

“If people ask me, I will come out here and talk to ’em about it,” says Robbins, as he and Sunderlin have done for reporters this season, because they can’t let the past die. “But no, I don’t just come out here and sit. It’s just … yeah … it’s very painful.”

Theirs are a couple of stories collected by David Primus over his years of researching life before the Blue Mesa. A community engagement facilitator at Western Colorado University and longtime Gunnison resident, he has presented regularly at the local library. Crowds always show.

“It’s not me,” Primus says. “It’s because it’s not there.”

Christian Murdock, The Gazette via AP
In this Tuesday, Dec. 4, 2018 photo, tools from the past lie on the remains of a shop foundation where the town of Iola, Colo., stood along the Gunnison River before the Blue Mesa Reservoir flooded the valley in the 1960s.

Wide-eyed young people, people with no memory of the valley before the flood, tell him their drives west along the shore to Montrose never will be the same.

“If you’re older than 65, and you’re local, you remember it,” Primus says. “So for older people, it’s just being reminded of what was lost. I’ve had several people come up to me afterwards, this one woman I remember the most. She said, ‘I almost didn’t come, because I didn’t think I could handle it.’”

The number of people displaced is uncertain. Primus guesses between 200 and 300.

“Put it this way,” Sunderlin utters, “it wasn’t enough to fight the government.”

The prospect of hydroelectricity, as well as storage and mass recreation, grew in the minds of regulators. So grew a dark cloud over the hayfields.

“The problem was the resistance was just local here,” Robbins says, recalling the populations in either direction taking on the role of bystander. “They promised them cheap electricity for the rest of their lives. They were more than willing to have a pond out here rather than a river.”

Perhaps they figured the human casualty would be minimal.

“Not that many in the big picture,” Primus says. “But for those 200, 300 people, they lost their livelihoods, oftentimes lost ranches that had been in the family for four generations.”

Robbins is an example, his ancestors having homesteaded the AK Stevens Ranch in the 1870s. Sunderlin grew up about 2 miles downstream at the Tex Lodge Ranch Resort, one of several tourist outfits rooted in the valley.

Best-known was the Sportsman’s Home, hosting the likes of John Wayne and Herbert Hoover over the years. The fishing took on a mythical quality, some of the best catches of the West reported right here. Primus learned of a tradition: Far upstream at a Sapinero hotel, a boy would report the spring larva hatch, and his father would send word across the land, attracting far away visitors.

It wasn’t only fishing that put Iola on the map. After the railroad came in 1881, the town became a pivotal stop for loading cattle on their way to market. Robbins and Sunderlin remember leaving school to help, and the nickel-priced reward of a candy bar at the convenience store.

Then, the road came, like an alarm sounding.

“You could hear ’em,” Robbins says, looking out to the blown corners of the canyon. “It was like World War III.”

And the trees. What they did to the trees. “It looked like they were just dropping bombs on ’em. Just sticks, sticking up.”

Sunderlin remembers the machines. How they got bigger, fiercer. “When they were clearing here, they had a D8 with a shear blade, a big blade that curled down, chopped the trees. Then they came in with a D7.”

Robbins distinctly remembers the summer. The summer of ’63. “That summer, we went ahead and put the hay up, baled it and sold it all. The year before, this was still the home ranch.”

The house was burned. Mom and Dad couldn’t watch, neither could the teenaged Robbins, but Grandma did. Auctions were held all over, farmers flocking for equipment, collectors for antiques.

Some buildings were saved, including the schoolhouse, which now sits just over the hill, the modified home of Sunderlin. He keeps pictures there. Him with his horse, Chico. Him with his dog, Bingo. Him as a smiling kid with that “sun-shining face.”

And Robbins keeps pictures, too, taken with the camera some down-on-his-luck passerby left Dad in exchange for 5 gallons of gas. Robbins never thought to use it, until it was all coming to an end. “If I knew then what I know now, I would’ve taken pictures all up and down this valley.”

Robbins and Sunderlin can see it now, the paradise before this hard, blank canvas.

They stop at the cement base of a flag pole, on which they still can make out their initials. They move on to the cut legs of a windmill. At a foundation that they believe was a barn, they sift through mangled tools, too contorted to determine their former purpose. But how amazing to find them still.

“You know, they made it very clear to us that this ground would be a mud flat,” Robbins says. “I mean, they didn’t envision this all to be here right now. They expected mud. They thought this would cover up so fast that nobody would ever come out and see it.”

The thought lingers for a moment, silence but for the cold wind. Iola’s sons trudge back to their trucks, leaving footprints soon to fade in the snow. And the sun shines on their faces, but they’re not smiling now.

___

Information from: The Gazette, http://www.gazette.com



josephwebster (Denver, CO, USA): There are a number of large reservoirs in Colorado with similar stories, including Horsetooth Reservoir in Ft. Collins, where I grew up.

The Problem of Package Manager Trust


Package managers are among the most valuable tools in a developer’s toolkit. A package can inject hundreds to thousands of lines of useful code into a project that a developer would otherwise have to write by hand. Ain’t nobody got time for that!

Of course, such tools do not come without risk as highlighted by the event-stream package incident.

Streams are node’s best and most misunderstood idea, and EventStream is a toolkit to make creating and working with streams easy.

This is a very popular NPM package with 1,592 downstream dependents and 1.9 million weekly downloads (at the time I wrote this). Unfortunately, the most recent release of the package contained a malicious dependency that went undiscovered for around two months. It was reported in this GitHub issue.
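If you want to know whether a package like this sits anywhere in your own dependency tree, running npm ls event-stream will tell you, and the lockfile can be inspected programmatically. Here is a minimal sketch (not part of any official tooling) that searches an npm lockfile for a package by name, assuming the standard lockfile v1 “dependencies” tree or the newer v2/v3 “packages” map:

```ts
// Sketch: list every place a named package appears in package-lock.json.
import { readFileSync } from "fs";

type DepNode = { version?: string; dependencies?: Record<string, DepNode> };
type Lockfile = {
  packages?: Record<string, unknown>;     // lockfile v2/v3
  dependencies?: Record<string, DepNode>; // lockfile v1
};

function findInLockfile(lockPath: string, name: string): string[] {
  const lock = JSON.parse(readFileSync(lockPath, "utf8")) as Lockfile;
  const hits: string[] = [];

  // v2/v3: keys look like "node_modules/foo" or "node_modules/a/node_modules/foo".
  for (const key of Object.keys(lock.packages ?? {})) {
    if (key === `node_modules/${name}` || key.endsWith(`/node_modules/${name}`)) {
      hits.push(key);
    }
  }

  // v1: walk the nested "dependencies" tree.
  const walk = (deps: Record<string, DepNode> | undefined, path: string) => {
    for (const [dep, node] of Object.entries(deps ?? {})) {
      const here = path ? `${path} > ${dep}` : dep;
      if (dep === name) hits.push(here);
      walk(node.dependencies, here);
    }
  };
  walk(lock.dependencies, "");

  return hits;
}

console.log(findInLockfile("package-lock.json", "event-stream"));
```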

How did this happen? Well, the original maintainer was no longer interested in maintaining the package and handed it off to another contributor who had previously made real contributions to the package. It appears that these contributions were made to gain trust and, ultimately, publish access to the package. Unfortunately, this new contributor was a bad actor, and not the Jean-Claude Van Damme variety of bad actor.

Put down the pitchforks

As you can imagine, many took to their pitchforks and directed a lot of blame and vitriol towards the maintainer. Just stop.

Blaming the maintainer is mean and doesn’t accomplish anything useful. It’s also a sign of intense immaturity.

Other industries learned a long time ago that a culture of blame doesn’t improve results. Mature teams think beyond the individual and apply the approach of a blameless postmortem.

If we go with “blame” as the predominant approach, then we’re implicitly accepting that deterrence is how organizations become safer. This is founded on the belief that individuals, not situations, cause errors. It’s also aligned with the idea that there has to be some fear that not doing one’s job correctly could lead to punishment. Because the fear of punishment will motivate people to act correctly in the future. Right?

The aviation industry realized that individual blame didn’t improve safety, but a focus on human factor design applied to the problem would. We live in an increasingly complex world with increasingly complex systems. It’s unreasonable to expect everyone will do the right thing every time. As The Checklist Manifesto notes,

The volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely, or reliably. Knowledge has both saved us and burdened us.

Even if you think that expectation is reasonable, it’s foolish to trust in such a system that’s entirely dependent on that expectation. This is why I’m very glad that pilots follow checklists rather than just rely on their memory when I’m a passenger.

This is not an isolated problem

It’s also important to understand, as Dominic points out in his statement on the incident, that this is not an isolated problem.

Hey everyone - this is not just a one off thing, there are likely to be many other modules in your dependency trees that are now a burden to their authors.

He goes on to highlight the big challenge for maintainers.

So right now, we are in a weird valley where you have a bunch of dependencies that are “maintained” by someone who’s lost interest, or is even starting to burnout, and that they no longer use themselves. You can easily share the code, but no one wants to share the responsibility for maintaining that code.

This could happen to a maintainer you know and love. This could happen to you.

How bad was it?

Before we talk about solutions, it might also help to put this particular incident in perspective. The NPM team published an incident report that describes how the vulnerability works. The malicious package only attacks users of the Copay Bitcoin wallet who have over 100 Bitcoin (BTC) or 1000 Bitcoin Cash (BCH). In USD that’s around $600K…wait no…$500K…actually it’s now $400K…still a lot of money. The afflicted are probably a pretty small subset of the population.

Also, it helps to understand that NPM has about 8 billion weekly package downloads. event-stream accounts for about 0.024% of those package downloads. So large in scale, but tiny in terms of overall package downloads. Of course, this doesn’t account for the number of potentially malicious packages we don’t know about. The point here is that it’s still pretty difficult to get someone to download your malicious package.
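For the curious, the share quoted above follows directly from the two approximate figures:

```ts
// Back-of-the-envelope share of npm's weekly downloads attributable to
// event-stream, using the approximate figures quoted above.
const eventStreamWeekly = 1.9e6; // ~1.9 million weekly downloads
const npmWeeklyTotal = 8e9;      // ~8 billion weekly downloads across npm
console.log(`${((eventStreamWeekly / npmWeeklyTotal) * 100).toFixed(3)}%`); // "0.024%"
```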

What’s the solution?

Ok, now that you put down your pitchforks (which is curious to me because how many of you are actually farmers?), let’s talk about solutions.

So what do we do about it? I don’t believe there’s a magic solution out there, but I think there are some mitigations worth discussion.

In a previous life, I was part of a team responsible for the NuGet package manager. Thinking about these sorts of attacks kept me up at night (but now I sleep like a baby). The approach I wanted to take at the time was to focus on identity, reputation, and webs of trust. I believe many of those ideas still apply today, but they don’t necessarily solve for this specific case. At least, not without a small adjustment.

NuGet recently added certificate signing of packages. This allows consumers to specify that they only want to install packages from verified people. This also doesn’t address this particular problem, for two reasons. Not every package maintainer will bother with the certificates. And even if they do, NuGet supports packages signed by an org. So if a burnt-out maintainer adds a bad actor to their org, all bets are off.

What my previous threat models missed is that we treat every package the same whether it’s depended on by a hundred people or a hundred million people. This ignores the fact that the threat models for these packages are not the same!

For example, changing the owner of (or giving someone else rights to publish) my barely-used package does not have the same threat impact as changing the owner of my super-popular left-pad package used by half of the internet.

There are a few things package managers might consider in this situation (in addition to the ideas I wrote about in my Trust and NuGet post).

  1. Consider a change of owner as a SemVer breaking change. At least that would prevent the package from being aggressively updated in most package managers (a rough sketch of this idea follows the list).
  2. When changing owners (or granting publish access), provide easy-to-understand reputation information for very popular packages. Maybe even block ownership transfers of extremely popular packages to suspect individuals. The point here is to leverage reputation and trust information in some useful way.
  3. Provide education, tools, and support to burnt-out maintainers. Work with open source foundations so that they can take over and perhaps vet and find maintainers for projects that maintainers want to dump. Don’t leave the entire burden on maintainers of projects that became way more popular than they anticipated. We need to share that burden.
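To make the first idea concrete, here is a rough sketch of what an “ownership change is a breaking change” policy could look like inside an update tool. The semver package and its satisfies function are real; the publisher metadata (originalOwner, publishedBy) is hypothetical, since registries don’t currently hand clients that information in a standard way:

```ts
// Sketch only: treat a release published by a new owner as if it were a
// semver-major change, so ordinary ranges like "^1.4.0" won't auto-install it.
import * as semver from "semver";

interface Release {
  version: string;     // e.g. "1.4.1"
  publishedBy: string; // account that published this release (hypothetical metadata)
}

interface PackageHistory {
  originalOwner: string; // owner the consumer originally trusted (hypothetical metadata)
}

function safeToAutoUpdate(history: PackageHistory, range: string, candidate: Release): boolean {
  // Must satisfy the declared range at all...
  if (!semver.satisfies(candidate.version, range)) return false;
  // ...and must come from the owner the range was written against. A release
  // from a new owner requires an explicit, opt-in upgrade instead.
  return candidate.publishedBy === history.originalOwner;
}

const history: PackageHistory = { originalOwner: "original-maintainer" };
// "^1.4.0" would normally pull 1.4.1 automatically, but not when a new owner published it.
console.log(safeToAutoUpdate(history, "^1.4.0", { version: "1.4.1", publishedBy: "new-owner" })); // false
console.log(safeToAutoUpdate(history, "^1.4.0", { version: "1.4.1", publishedBy: "original-maintainer" })); // true
```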

This is an area where GitHub (full disclosure: I’m a former GitHub employee) could really take a lead in concert with the various package managers. GitHub has a wealth of information about repositories, their dependencies, and the people who work on them. This information could help maintainers make better choices, if it was integrated with the information that package managers have on hand. This would require deep cooperation between package managers and GitHub. I think this would be a very good thing.
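As a thought experiment, the “reputation information” from the second idea above could be as simple as a handful of signals pulled from the registry and from GitHub. None of the fields below map to an existing API; this is only an illustration of how such signals might gate a transfer of publish rights:

```ts
// Hypothetical signals a registry could weigh before publish rights for a
// popular package change hands. All field names are illustrative; no registry
// or GitHub API exposes this structure today.
interface TransferReviewSignals {
  weeklyDownloads: number;                  // from the registry
  downstreamDependents: number;             // from the registry's dependency graph
  candidateAccountAgeDays: number;          // from the prospective new owner's account
  candidateMergedPrsToRepo: number;         // prior real contributions to this project
  candidateOtherPackagesMaintained: number; // track record elsewhere
}

// Very popular package plus a thin track record: require human review instead
// of silently completing the transfer.
function transferNeedsManualReview(s: TransferReviewSignals): boolean {
  const veryPopular = s.weeklyDownloads > 1_000_000 || s.downstreamDependents > 500;
  const thinTrackRecord =
    s.candidateAccountAgeDays < 365 &&
    s.candidateMergedPrsToRepo < 10 &&
    s.candidateOtherPackagesMaintained === 0;
  return veryPopular && thinTrackRecord;
}

console.log(
  transferNeedsManualReview({
    weeklyDownloads: 1_900_000,
    downstreamDependents: 1_592,
    candidateAccountAgeDays: 120,
    candidateMergedPrsToRepo: 3,
    candidateOtherPackagesMaintained: 0,
  })
); // true
```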

What about paying maintainers?

One solution a lot of folks bring up is paying maintainers to maintain these packages. The thinking goes that there are a lot of companies flush with money who get immense benefit from open source without giving back. Why shouldn’t they contribute to the maintenance of these packages?

Of course I’m all for systems that help maintainers get paid for the work they do. At the same time, I’m skeptical that this will actually solve the trust problem.

Often, the resource that’s really scarce for maintainers is time, not money. A lot of maintainers work on their open source projects on the side while holding down a full-time job. Part of the cause of burnout is the additional stress of working on a bunch of side projects on the weekend. A few grand extra doesn’t necessarily solve that problem. Their day job is unlikely to let them work fewer hours because they have a side project.

The only way this works is if maintainers are paid enough to quit their day jobs and maintain their open source projects full-time. This works great if you’re the maintainer of Linux, but probably not sustainable for someone who maintains a handful of small packages.

This problem is a classic tragedy of the commons example.

The tragedy of the commons is a term used in social science to describe a situation in a shared-resource system where individual users acting independently according to their own self-interest behave contrary to the common good of all users by depleting or spoiling that resource through their collective action.

Some might object to this depiction: what resource is being spoiled by using a package? Downloading a package doesn’t take anything away from the maintainer. In practice, it’s the package author’s attention that’s depleted. The more people who use a package, the more issues the maintainer has to wade through. It becomes a big slog.

What I want to see is a large scale communal ownership attitude by companies towards open source. We need to take care of our commons, pitch in, find solutions. Don’t just contribute a bit of cash to maintainers, but work together to provide maintainers work time to maintain their packages. Find creative solutions to both the time and money scarcity problem.

josephwebster (Denver, CO, USA): Package managers have always been a weak link in software development security because using them involves a level of trust that is unwarranted. Unfortunately, without package managers most modern software projects would not be tractable or even possible. The central idea of this article is that the risks associated with these packages are not equal and effective mitigation is decidedly nontrivial.

This photo should not exist


pin.it/fnnc4j6fjamugy
Once we get past the creep factor of Nazi army uniforms, we see a communications team sending a secret message. They are using the legendary Enigma machine to encrypt the message. But why, why did that officer allow a photographer to record this highly sensitive activity? A failure of operational security (OPSEC). Allies in …

The post This photo should not exist appeared first on Security Boulevard.
