COVID-19: Here’s How Long Average NY Single Would Travel To Go On Date During Pandemic
The average single person in New York would travel 2.1 hours to go on a date amid the COVID-19 pandemic, according to a survey conducted by EverydayCarry.com, and half of […] View full post on National Cyber Security
Sending children back to school would worsen COVID-19 problem
County’s virtual plan has enough class time | Published: 2020-08-31 10:09 To the editor: I am a junior at Bethesda-Chevy Chase High School, responding to the letter from Darci Rochkind […] View full post on National Cyber Security
Advocates say youth shelter in Truro would protect teens from human trafficking
The recent case of a man who was unlawfully at large in the Truro, N.S., area and allegedly committed dozens of sex and drug trafficking crimes against children shows the need for a local youth shelter is dire, says a youth advocate.
Michelle Rafuse, a volunteer who supports First Nations youth in court, said a shelter for young people would help prevent at-risk youth from becoming victims of violence and sexual exploitation.
“There’s no place for kids to go in Truro if they need a place to stay,” said Rafuse, who often allows homeless kids to stay at her own home.
“If they have no place to go, they end up in circumstances where they could get led down a path they don’t want to be on.”
Truro and the surrounding Colchester County do not have a youth shelter. The counties of Digby, Yarmouth, Shelburne, Pictou and Halifax all have youth shelters serving their areas.
Youth shelters are usually run by not-for-profit groups and are aimed at ending homelessness for people aged 16 to 24. The youth stay for several months and receive connections and support to help them get their lives on track.
CBC News spoke to a 22-year-old Indigenous man who said he spent the last several years homeless in Truro. He said he used to walk around the town at night messaging friends and asking for a place to stay, often crashing at the homes of friends on their laundry room floors. If he couldn’t find a place to stay, he kept walking.
“It was weird sleeping outside, so I just stayed awake,” said the young man, who recently found housing because a member of the community offered up her home. “There’s a lot of people in the same boat. There’s a pretty big need for it.”
CBC News is not identifying the man because he has been a participant in the youth criminal justice system, involved in break and enters, which he said he did to get money to support himself. He said he wouldn’t have committed those crimes had he not been so desperate and had a safe home.
He said his troubles started in his teens when his relationship with his father turned volatile. Upset about the fighting, he failed to turn up at his job baking cookies and bread at a local bakery. After losing the job, he said he was kicked out of the house because he could no longer pay the rent.
Truro has an emergency shelter, but it’s not just for youth
Truro has a youth centre, but it is open only during the day and has been closed due to COVID-19. There is also an overnight emergency shelter that accepts youth over the age of 16, but it is not dedicated to young people.
Truro, with its population of 12,500, is a hub town, a crossroads where the Trans-Canada Highway joins from three different directions. The town is next door to the Millbrook First Nation, a Mi’kmaw community with many members living off-reserve.
A 2018 Statistics Canada study found Nova Scotia had the highest rate of human trafficking in the country in 2016.
Joe Pinto, a local developer and businessman, said he’s noticed the growing issue of youth homelessness in Truro.
“There seems to be a lot of kids that the parents are not available to look after them or they’re just on the street, couch-surfing, going from place to place. I feel that there’s a need to house them and give them a bit of guidance,” he said.
Pinto said he has space available in downtown Truro for a youth shelter if a community group is interested.
Social services and housing are provincial responsibilities. In a joint statement, the Department of Community Services and the Department of Municipal Affairs and Housing described having a place to live as an important piece of the complex problem of human trafficking.
To propose a youth shelter for the town, a community group would first have to submit a proposal, which could include a request for funding, to Nova Scotia’s Department of Municipal Affairs and Housing. So far, no such proposal has come forward.
What the province says it’s doing
The Nova Scotia government earmarked $1.4 million in new funding to combat human trafficking, some of which is trickling down to Truro. In the town, there is one housing support worker and a trustee who can help at-risk youth find secure, stable housing. The province said rent subsidies are available and the Truro Homeless Outreach Society can also connect people to safe and affordable housing.
Truro Mayor Bill Mills said he’s open to the idea of a youth shelter, but it’s going to be a tough sell right now to get funding from the municipality because everyone is being stretched.
“On the surface, if we could pull this off and have a youth shelter and the right people in place… sure, why not,” he said, adding that a letter to council would be the first step.
CISOs are willing to sacrifice an average of $9,642, or 7.76% of their salaries, for better work-life balance – an elusive goal among those whose employers demand more of their time and effort.
In a study conducted by Vanson Bourne and commissioned by Nominet, researchers interviewed 400 CISOs and 400 C-suite executives to learn more about the toll of continued stress on the mental health and personal lives of security leaders, who have increasingly reported poor work-life balance and little board-level support. They discovered most (88%) CISOs they surveyed are moderately or tremendously stressed, slightly down from 91% in 2019.
Nearly half (48%) of CISOs say work stress has had a detrimental effect on their mental health, nearly double the 27% who said the same last year. Thirty-one percent report the stress has affected their physical health, 40% say it has affected relationships with partners and children, and almost one-third say it has affected their ability to do their jobs. Ninety percent of CISOs would take a pay cut if it meant they could have a more even work-life balance.
There is no single source of CISOs’ stress, but excessive hours are a major factor. Almost all CISO respondents (95%) work more hours than contracted, with an average of 10 extra hours per week. Eighty-seven percent say their employers expect them to work additional hours. Only 2% of CISOs say they can “switch off” when they leave the office, and 83% report they spend at least half of their evenings and weekends thinking about their jobs.
“At my level, at even more junior levels, there’s an expectation that we’re always on,” says Nominet vice president of cybersecurity Stuart Reed. “There is this notion of never really switching off for any long period of time.” All of these extra hours add up: Ten extra hours of work each week amounts to $30,319 in extra time CISOs give their organizations each year.
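As a rough back-of-the-envelope check (the study’s exact assumptions aren’t published here), that figure can be approximately reconstructed from the survey’s own numbers: the $9,642 pay cut is stated to equal 7.76% of salary, which implies an average CISO salary of roughly $124,000. The working-hours assumptions below are illustrative, not from the study:

```python
# Rough reconstruction of the "extra hours" value cited in the study.
# Assumptions (mine, not the study's): a 40-hour contracted week, 52 weeks/year.
PAY_CUT = 9_642           # dollars CISOs would sacrifice on average
PAY_CUT_SHARE = 0.0776    # stated as 7.76% of salary
EXTRA_HOURS_PER_WEEK = 10

implied_salary = PAY_CUT / PAY_CUT_SHARE        # ~ $124,253
hourly_rate = implied_salary / (40 * 52)        # ~ $59.74 per contracted hour
extra_value = hourly_rate * EXTRA_HOURS_PER_WEEK * 52

print(f"Implied average salary: ${implied_salary:,.0f}")
print(f"Annual value of extra hours: ${extra_value:,.0f}")
```

This lands within a few percent of the $30,319 the study cites; the researchers presumably used slightly different salary or working-hours assumptions.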
Security leaders are expected to wear many hats during those hours. “CISOs are very much expected to be experts not just from a technical perspective, but being able to translate those technical concepts into the business risk or business strategy concepts,” Reed says. “The very blended nature of their role means they are potentially taking on the responsibility of more than one person’s job.”
It’s impossible to decouple CISOs’ stress from the evolving threat landscape. Mainstream news coverage of major cyberattacks puts an ever-growing spotlight on the CISO, explains Gary Foote, CIO of the Haas Formula One racing team, who also handles security for his employer. As soon as an organization gets media attention for a data breach, it escalates to the board level.
“That gets their attention, and they’re going down to the CISO and saying, ‘You have to make sure this doesn’t happen to us,’” Foote says. “A good amount of C-suite executives will see an attack as inevitable, but there will always be a significant portion that don’t.” Nominet’s study found 24% of CISOs report their boards don’t view security breaches as inevitable.
Bonding with the Board
Researchers discovered a telling gap between CISOs and the C-suite when it comes to CISO responsibilities and expectations. The board does take cybersecurity seriously – 47% say it’s a “great” concern – and 74% say their security teams are moderately or tremendously stressed.
The C-suite may recognize the importance of cybersecurity and appreciate CISOs’ stress, but that doesn’t translate into greater support for CISOs. Just about all (97%) of C-suite respondents say the security team could improve on delivering value for the amount of budget it receives. This indicates that despite the additional hours CISOs work, the C-suite thinks they should still be doing more.
Demonstrating return on investment has long been a challenge for security teams. A low investment in cybersecurity could result in zero incidents; a high investment may still result in a breach. It’s difficult to prove return on investment when the measure of success is a breach that doesn’t happen. The challenge, says Foote, is trying to relay this to a corporate board.
Both CISOs (37%) and the C-suite (31%) say the CISO is ultimately responsible for responding to a data breach. Nearly 30% of CISOs say the executive team would fire the responsible party in the event of a breach; 31% of C-suite respondents confirmed this. Twenty percent of CISOs say they would be fired whether or not they were responsible for the incident.
Kelly Sheridan is the Staff Editor at Dark Reading, where she focuses on cybersecurity news and analysis. She is a business technology journalist who previously reported for InformationWeek, where she covered Microsoft, and Insurance & Technology.
The post 90% of CISOs Would Cut Pay for Better Work-Life Balance appeared first on National Cyber Security.
Bill Would Make Possession Of Ransomware A Crime In Maryland – CBS Baltimore
CAPITAL NEWS SERVICE — State lawmakers heard arguments Tuesday on a bill that seeks to add criminal penalties for knowingly possessing ransomware with the intent to use it in a malicious way. Ransomware is a type of malware that can impede the use of a computer […] View full post on AmIHackerProof.com
Security experts explain why unlocking the Pensacola shooter’s iPhones would unleash a privacy nightmare for iPhone owners
- Apple’s decision not to unlock or create a backdoor into the iPhones used by a gunman in a Florida shooting last month puts the tech giant at odds with the United States government yet again.
- Security experts agree, however, that circumventing the iPhone’s security poses a significant risk to iPhone users since it would provide a means to obtain private data that even Apple can’t presently access.
- There’s a risk that such a tool could fall into the wrong hands, some experts warn.
Attorney General William Barr recently called on Apple to help unlock the iPhones used by a gunman in Pensacola, Florida last month – a situation that once again requires the tech giant to balance protecting consumer privacy with its legal obligation to assist in investigating a shooting that’s resulted in the loss of American lives.
But security experts agree that providing access to the shooter’s iPhone could jeopardize the security of the millions of iPhones in use around the world.
“In essence, you’re trying to make a weapon that can only be used on a single target,” Jacob Doiron, an information systems lecturer at San Diego State University, told Business Insider. “But that’s not the nature of weapons, or exploits. They are applicable to any device that has that profile or configuration.”
On Monday, Barr said that Apple had not provided any “substantive assistance” in getting access to two iPhones belonging to the shooter, Mohammad Alshamrani, who killed three people at a naval airbase last month. But Apple has since disputed that characterization, saying that it had provided iCloud backups, account information, and other data from Alshamrani’s account in cooperating with the investigation. Now, Apple is reportedly gearing up for a legal battle with the Department of Justice to defend its position, according to The New York Times.
“We have always maintained there is no such thing as a backdoor just for the good guys,” Apple said in a comment to Business Insider. “Backdoors can also be exploited by those who threaten our national security and the data security of our customers.”
Apple took a similar position in 2016 when it was caught in a stand-off with the Federal Bureau of Investigation over whether it should unlock an iPhone linked to a shooting in San Bernardino, California. Apple refused to unlock the iPhone, and the FBI ultimately ended up working with a private company to gain access to the device.
The crux of the issue when it comes to unlocking an iPhone or bypassing its encryption, according to privacy experts, is that once Apple creates a backdoor, there’s a risk that it can be used in unpredictable and, in some cases, harmful ways.
“I would say the chances of it falling into the wrong hands are 100%,” said Mark Nunnikhoven, vice president of cloud research for cybersecurity firm Trend Micro.
There’s also the question of why Apple couldn’t just create the tool for the purposes of the investigation and then push an update to iPhones that would render it obsolete. For that to work, the backdoor would have to be tied to the software only, not the iPhone’s hardware, says Doiron. “Sometimes these vulnerabilities take place at the hardware level,” he said. “That’s not something that could be fixed via software.”
“We’re on your side”
The broader issue, however, may be that creating such a tool would put private, encrypted data from iPhone users in the hands of Apple and its employees – a privilege the company doesn’t want to begin with. Such a move would be in stark opposition to Apple’s stance on consumer privacy.
“You are not our product,” Apple CEO Tim Cook said in an interview with ABC News last year. “Our products are iPhones and iPads. We treasure your data. We want to help you keep it private and keep it secure. We’re on your side.”
Photo: Apple CEO Tim Cook. Source: REUTERS/Toru Hanai
Theoretically, if Apple were to create some type of tool or key that would provide backdoor access to encrypted iPhone data, employees from Apple would have access to that information as well since they would likely be assisting in the investigation. What’s to prevent an Apple worker from going rogue and possibly leaking iPhone user data, or using the tool for nefarious purposes?
Nunnikhoven pointed to EternalBlue as an example of how a tool built for specific purposes could fall into the wrong hands. EternalBlue was a National Security Agency hacking tool that leaked to the public in 2017 that was linked to the WannaCry ransomware attack that infected computers all over the world during that same year.
Creating the tool in general would also require a significant effort on Apple’s part. It’s not simply about cracking the passcode of the device, but would likely require that a dedicated team at Apple create a piece of software capable of accessing the data stored on the device, says Nunnikhoven. The government, in other words, is asking Apple to enable something that isn’t even possible on iPhones today.
Unlocking these iPhones for the Pensacola investigation would also likely set a precedent for law enforcement agencies to request similar treatment for future cases as well, says Matt Wilson, chief information security advisor at BTB Security.
“It’s just more evidence to prove this isn’t just [cybersecurity experts] saying, ‘I don’t want to think about it,’” said Wilson. “It’s [experts] saying we’ve thought about it very long and very hard, and we don’t see a viable way that addresses all of these issues.”
The post #nationalcybersecuritymonth | Security experts explain why unlocking the Pensacola shooter’s iPhones would unleash a privacy nightmare for iPhone owners appeared first on National Cyber Security.
For years, organisations have been using a common tactic called the warrant canary to warn people that the government has secretly demanded access to their private information. Now, a proposed standard could make this tool easier to use.
When passed in 2001, the US Patriot Act enabled authorities to access personal information stored by a service provider about US citizens. It also let them issue gag orders that would prevent the organisation from telling anyone about it. It meant that the government could access an individual’s private information without that person knowing.
Companies like ISPs and cloud service providers want their users to know whether the government is asking for this information. This is where the warrant canary comes in. First conceived by Steve Schear in 2002, shortly after the Patriot Act came into effect, a warrant canary is a way of warning people that the organisation holding their data has received a subpoena.
Instead of telling people that it has been served with a subpoena, the organisation stops telling them that it hasn’t. It displays a public statement online that it only changes if the authorities serve it with a warrant. As long as the statement stays unchanged, individuals know that their information is safe. When the statement changes or disappears, they can infer that all is not well without the organisation explicitly saying so.
A warrant canary can be as simple as a statement that the service provider has never received a warrant. The problem is that those statements aren’t standardised, which makes it difficult for people to interpret them. How can you be sure that a warrant canary means what you think it means? If it disappears, does that mean that the service provider received a warrant, or did someone just forget to include it somewhere? Does the canary’s death indicate a sinister problem, or did it just die of natural causes? This isn’t idle speculation – warrant canary changes like SpiderOak’s have confused users in the past.
The other problem is that these statements are designed to be read by people, which makes them difficult to track and monitor at scale. That’s what the warrant canary standard would solve.
The proposed standard surfaced on GitHub on Tuesday. It was created by GitHub user carrotcypher, inspired by the work of organisations like the Calyx Institute (a technology non-profit that develops free privacy software) and the now-defunct Canary Watch, a project from the Electronic Frontier Foundation (EFF), Freedom of the Press Foundation, NYU Law, Calyx and the Berkman Center. Canary Watch listed and tracked warrant canaries. When it shut down Canary Watch, the EFF explained:
In our time working with Canary Watch we have seen many canaries go away and come back, fail to be updated, or disappear altogether along with the website that was hosting it. Until the gag orders accompanying national security requests are struck down as unconstitutional, there is no way to know for certain whether a canary change is a true indicator. Instead the reader is forced to rely on speculation and circumstantial evidence to decide what the meaning of a missing or changed canary is.
Canarytail seeks to change that. As it explains on its GitHub readme.md page:
We seek to resolve those issues through a proper standardized model of generation, administration, and distribution, with variance allowed only inside the boundaries of a defined protocol.
Instead of some arbitrary language on a website, the warrant canary standard would be a file in JSON format, which is notable for displaying data as a list of key:value pairs readable by both people and machines. The file would include 11 codes with a value of zero (false) or one (true). These codes include WAR for warrants, GAG for gag orders, and TRAP for trap and trace orders, along with another code for subpoenas, all of which have specific legal implications for an organisation and its users. If the value next to any of these keys is zero, the person or software reading the file can infer that none of the warnings have been triggered. If a code changes to one, it’s cause for concern.
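As a sketch of how such a file could be consumed by software (the field names below are illustrative; the actual canarytail schema may differ), a monitoring tool only has to parse the JSON and list any codes that have flipped to one:

```python
import json

# A hypothetical canary file: 0 = warning not triggered, 1 = triggered.
# Field names are illustrative; the real canarytail schema may differ.
canary_json = """
{
  "domain": "example.org",
  "codes": {"WAR": 0, "GAG": 0, "TRAP": 0, "SUBPOENA": 0}
}
"""

def triggered_codes(doc: dict) -> list[str]:
    """Return the warning codes whose value is 1 (true)."""
    return [code for code, value in doc["codes"].items() if value == 1]

doc = json.loads(canary_json)
print(triggered_codes(doc))  # [] -> no warnings triggered, canary is alive
```

Because the check is purely mechanical, a watcher service could poll many organisations’ canary files and alert users the moment any code becomes nonzero, with no human interpretation required.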
The file also contains some other interesting codes, including DURESS, which indicates that the organisation is being coerced somehow, along with codes indicating that it has been raided. There is also a special code indicating a seppuku pledge, which is a promise that an organisation will shut down and destroy all its data if a malicious entity takes control of it.
In a smart bit of cryptographic manoeuvring, each canary under the proposed standard must be cryptographically signed, so that it can be verified against the organisation’s published public key, and includes an expiry date. It uses a block hash from the bitcoin blockchain to prove the freshness of the digital signature. As another safeguard, it includes a PANICKEY field holding a second public key. If the file is signed with the corresponding key, people can interpret it as a kill switch, causing the warrant canary to fail immediately. That’s useful if an organisation suddenly gets raided and can’t afford to wait until the current warrant canary file expires.
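A minimal sketch of that liveness logic, ignoring the actual signature and blockchain checks and using hypothetical field names, might look like this:

```python
from datetime import datetime, timezone

# Hypothetical canary metadata; the real canarytail fields may differ.
canary = {
    "release": "2020-01-01T00:00:00+00:00",
    "expiry":  "2020-04-01T00:00:00+00:00",
    "signed_with_panic_key": False,  # True would mean the kill switch fired
}

def canary_alive(canary: dict, now: datetime) -> bool:
    """A canary is alive only if it is unexpired and not panic-signed."""
    if canary["signed_with_panic_key"]:
        return False  # kill switch: fail immediately, don't wait for expiry
    expiry = datetime.fromisoformat(canary["expiry"])
    return now < expiry

now = datetime(2020, 2, 1, tzinfo=timezone.utc)
print(canary_alive(canary, now))  # True: before expiry, no panic signature
```

The expiry date is what forces the organisation to keep re-signing the canary: silence (a lapsed expiry) and the panic signature both read as a dead canary, which is exactly the fail-safe behaviour a warrant canary needs.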
A standard like this could help revive warrant canaries by making them easier to track and more deterministic. In the meantime, plenty of non-standard warrant canaries have disappeared, including Reddit’s and Apple’s.
The post Proposed standard would make warrant canaries machine-readable – Naked Security appeared first on National Cyber Security.
Microsoft does more research and development in China than it does anywhere else outside the United States. But, as US-China relations continue to sour on issues of trade and cyber-security, the decades-long ties Microsoft has in China are coming under close scrutiny.
In an interview with BBC News, Microsoft’s chief executive Satya Nadella has said that despite national security concerns, backing out of China would “hurt more” than it solved.
“A lot of AI research happens in the open, and the world benefits from knowledge being open,” he said.
“That to me is what’s been true since the Renaissance and the scientific revolution. Therefore, I think, for us to say that we will put barriers on it may in fact hurt more than improve the situation everywhere.”
Microsoft’s first office in China was opened by founder and then-chief executive Bill Gates in 1992. Its main location in Beijing now employs more than 200 scientists and involves over 300 visiting scholars and students. It is currently recruiting for, among other roles, researchers in machine learning.
In April, it was reported by the Financial Times that Microsoft researchers were collaborating with teams at China’s National University of Defence Technology, working on artificial intelligence projects that some outside observers warned could be used for oppressive means.
Speaking to the newspaper, Republican Senator Ted Cruz said: “American companies need to understand that doing business in China carries significant and deepening risk.”
He added: “In addition to being targeted by the Chinese Communist party for espionage, American companies are increasingly at risk of boosting the Chinese Communist party’s human rights atrocities.”
Technology as weapon
Mr Nadella acknowledged that risk.
“We know any technology can be a tool or a weapon,” he told the BBC.
“The question is, how do you ensure that these weapons don’t get created? I think there are multiple mechanisms. The first thing is we, as creators, should start with having a set of ethical design principles to ensure that we’re creating AI that’s fair, that’s secure, that’s private, that’s not biased.”
He said he felt his company had sufficient control over how the controversial emerging technologies are used, and said the firm had turned down requests in China – and elsewhere – to engage in projects it felt were inappropriate, due to either technical infeasibility or ethical concerns.
“We also recognise whether it’s in the United States, whether it’s in China, whether it’s in the United Kingdom, they will all have their own legislative processes on what they accept or don’t accept, and we will abide by them.”
‘Leaves me wondering…’
Matt Sheehan, from the Paulson Institute, studies the relationship between California’s technology scene and the Chinese economy. He said Microsoft’s efforts, particularly its Beijing office, have had tremendous impact.
“It dramatically advanced the field, advances that have helped the best American and European AI research labs push further,” he said.
“But those same advances feed into the field of computer vision, a key enabler of China’s surveillance apparatus.”
He cites one particular paper as highlighting the complexity of working with, and within, China. Deep Residual Learning for Image Recognition, published in 2016, was a research paper produced by four Chinese researchers working at Microsoft.
According to Google Scholar, which indexes research papers, their paper was cited 25,256 times between 2014 and 2018 – more than any other paper in any field of research.
“The lead author now works for a US tech company in California,” said Mr Sheehan, referring to Facebook.
“Two other authors work for a company involved in Chinese surveillance. And the last author is trying to build autonomous vehicles in China.
“What do we make of all that? Honestly, it leaves me – and I think it should leave others – scratching their heads and wondering.”
Follow Dave Lee on Twitter @DaveLeeBBC
Do you have more information about this or any other technology story? You can reach Dave directly and securely through encrypted messaging app Signal on: +1 (628) 400-7370
The post #computersecurity | Blocking research with China would ‘hurt’, Microsoft boss says appeared first on National Cyber Security.
Raising the minimum wage by $1 per hour would result in a substantial decrease in the number of reported cases of child neglect, according to a new study co-authored by an Indiana University researcher. Congress is…
The post Raising the minimum wage would reduce child neglect cases appeared first on Become007.com.
View full post on Become007.com