Tokenization can be used to protect any sensitive data within an organization, with little overhead
One of the most difficult tasks in information security is protecting sensitive data across complex, distributed enterprise systems: a mix of legacy systems and cloud-based applications, all intertwined with critical business requirements. Added to the ever-present risk of data breaches are the various privacy laws and regulations, such as GDPR and the California Consumer Privacy Act.
When implemented properly, encryption is one of the most effective security controls available to enterprises, but it can be challenging to deploy and maintain across a complex enterprise landscape. Worse, encrypted data can still be exploited in a breach if the encryption keys are compromised along with it. Fortunately, there are other data protection options that enterprises can implement with far less disruption—namely, tokenization.
Increasingly, enterprises are turning to tokenization because it offers a stateless, data-centric approach with fewer security gaps and risks. With tokenization, security travels with the data while it’s at rest, in use and in motion. As a result, no additional security methods are needed to provide protection when the data leaves the enterprise.
Tokenization accomplishes this by replacing the original sensitive data with randomly generated substitute characters as placeholder data. These random characters, known as tokens, have no intrinsic value, but they allow authorized users to retrieve the sensitive data when needed. If tokenized data is lost or stolen, it is useless to cybercriminals. The tokenized data can also be stored in the same size and format as the original data. This is ideal for enterprise environments—especially those with legacy systems—since the tokenized data requires no changes in database schema or processes.
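To make the mechanics concrete, here is a minimal sketch of a classic (vaulted) tokenizer: it substitutes each value with random digits of the same length, keeping the token-to-value mapping server-side. The `TokenVault` class and its method names are illustrative, not any particular vendor's API.

```python
import secrets


class TokenVault:
    """Toy vaulted tokenizer: random same-format tokens, mapping kept server-side."""

    def __init__(self):
        self._vault = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        # Random digits of the same length, so downstream database
        # schemas and processes need no changes.
        token = "".join(secrets.choice("0123456789") for _ in value)
        while token in self._vault:  # avoid the rare collision
            token = "".join(secrets.choice("0123456789") for _ in value)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with access to the vault can recover the real value.
        return self._vault[token]
```

Because the token carries no intrinsic value, a stolen copy of tokenized data is useless without the vault—which is precisely why the vault itself becomes the high-value target discussed below.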
The use of tokenization also minimizes data exposure. Applications generally work with tokens and access real values only when absolutely necessary. Although tokenization is most commonly associated with credit cards, it is applicable to virtually every industry, for data types such as Social Security numbers, birth dates, passport numbers and account numbers. Through the use of network-level and REST APIs, tokenization can be integrated into a variety of enterprise environments.
Tokenization is also a secret weapon for organizations with heavy compliance burdens. Financial institutions, for instance, are often responsible for securing millions of account holder credentials in data infrastructures that are subject to PCI DSS regulations. Tokenizing as much data as possible allows these organizations to ease their compliance burdens as tokens are not generally within the scope of audits.
With the advent of vaultless tokenization, the implementation of tokenization in the enterprise is now a relatively straightforward affair. Legacy methods of “vaulted” tokenization require maintaining databases with tokens and their corresponding real data. These token vaults represent a high-risk target for theft. Furthermore, large token vaults often present complex implementation problems, particularly in distributed, worldwide deployments. One could argue that the implementation challenges surrounding vaulted tokenization are a primary reason why enterprises continue to leave sensitive data vulnerable to cyberattackers.
No Vault Database to Maintain
In contrast, vaultless tokenization is safer and more efficient while offering the advantage of either on-premises or cloud deployment. In this model, a hardware security module (HSM) is used to cryptographically tokenize data. This data can then be detokenized, returning the appropriate portion of a record, for use by authorized parties or applications. In this model, there is no token vault or centralized token database to maintain.
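Conceptually, vaultless tokenization works like format-preserving encryption: a keyed, reversible transform over digit strings, so no mapping table exists to steal. The toy sketch below illustrates the idea with a balanced Feistel network keyed by HMAC. It is an illustration only—not NIST FF1—and in a real deployment the key would live inside the HSM, never in application code.

```python
import hashlib
import hmac

ROUNDS = 8


def _round_fn(key: bytes, rnd: int, half: int, width: int) -> int:
    # Keyed pseudorandom function over one half of the digit string.
    mac = hmac.new(key, f"{rnd}:{half}".encode(), hashlib.sha256).digest()
    return int.from_bytes(mac[:8], "big") % (10 ** width)


def tokenize(digits: str, key: bytes) -> str:
    # Balanced Feistel over an even-length digit string: the token has
    # the same length and format as the input, but no vault is needed.
    w = len(digits) // 2
    left, right = int(digits[:w]), int(digits[w:])
    for rnd in range(ROUNDS):
        left, right = right, (left + _round_fn(key, rnd, right, w)) % (10 ** w)
    return f"{left:0{w}d}{right:0{w}d}"


def detokenize(token: str, key: bytes) -> str:
    # Run the rounds in reverse to recover the original value.
    w = len(token) // 2
    left, right = int(token[:w]), int(token[w:])
    for rnd in reversed(range(ROUNDS)):
        left, right = (right - _round_fn(key, rnd, left, w)) % (10 ** w), left
    return f"{left:0{w}d}{right:0{w}d}"
```

The key design point: tokenization and detokenization are pure functions of the key, so any authorized node can perform either operation without consulting a central database—which is what makes distributed, worldwide deployments tractable.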
The information security principle of least privilege dictates that organizations limit access to sensitive data to only what an individual needs to do their job; any additional access is an unnecessary exposure of sensitive data. Intelligence agencies have operated under this need-to-know principle for years. It reduces the risk of data breaches of both the accidental and intentional varieties.
Customizing detokenization output based on user or application role is one way to accomplish this. For example, loyalty applications may find a partially detokenized account number, perhaps just the last four digits of a credit card number, sufficient to do their job, while an e-commerce application would likely require a fully detokenized account number for repeat purchases. Other applications, such as business analytics, may be able to use the token itself as an identifier without any need to ever detokenize it.
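A role-based detokenization policy can be sketched as a simple mapping from role to output format. The role names and masking rules below are hypothetical examples matching the scenarios above, not a prescribed scheme.

```python
def reveal(pan: str, role: str) -> str:
    """Return the view of a card number appropriate to the caller's role."""
    policies = {
        "payments": lambda p: p,                           # full value for repeat purchases
        "loyalty": lambda p: "*" * (len(p) - 4) + p[-4:],  # last four digits only
    }
    if role not in policies:
        # Roles like analytics work on the token itself and never detokenize.
        raise PermissionError(f"role {role!r} may not detokenize")
    return policies[role](pan)
```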
Historically, the protection of credit and debit card numbers, for both payment and non-payment processes, has been the main application for tokenization, but the largest opportunity going forward is the general protection of sensitive data. With the costs of recovering from a data breach spiraling out of control, the case for tokenization in the enterprise is an easy one to make.
The post Making a Case for Tokenization in the Enterprise appeared first on National Cyber Security.