Our first cyber security round-up of 2020 details updates to NHSmail and advice from the National Cyber Security Centre on the use of Windows 7, after Microsoft officially ended support for the platform.
More than two-thirds of healthcare organisations suffered a cyber security incident in 2019
New research by data security provider Clearswift suggests that more than two-thirds (67%) of healthcare organisations in the UK have experienced a cyber security incident in the past year.
The research, which surveyed senior business decision-makers within healthcare organisations, found that almost half (48%) of incidents within the sector occurred as a result of introduction of viruses or malware from third-party devices – including IoT devices and USB sticks.
The survey found that further causes of cyber security incidents included employees sharing information with unauthorised recipients (39%), users not following protocol/data protection policies (37%), and malicious links in emails and on social media (28%).
The report once again highlights the serious threat that data breaches and malicious attacks pose to health data in the UK.
Alyn Hockey, VP of product management at Clearswift, said: “The healthcare sector holds important patient data, so it is alarming to see such high numbers of security incidents occurring in the industry.
“The healthcare sector needs to securely share data across departments and organisations in order to facilitate excellent patient care.
“With the proliferation of third-party devices in this process, it’s more important than ever that the industry bolsters its cyber security efforts to reduce the risk of everything from unwanted data loss to malicious attacks and focusses on keeping patient data safe and secure.”
NHSmail updates to improve security and user experience
NHS Digital is updating NHSmail to improve cyber security and save staff some 40,000 hours of manual work.
Dan Jeffery, head of innovation, delivery and business operations at NHS Digital’s Data Security Centre, detailed a number of improvements being made to the NHSmail platform around security, identity verification and user experience in a blog post on 6 January.
This includes a system to automate the movement of user accounts between NHSmail organisations, which Jeffery said would lead to “millions of pounds worth of efficiency savings.”
A password synchronisation micro-service allowing users to synchronise their password from the NHS Directory to their local active directory, and behavioural and transactional analysis providing insight into user behaviour, are also in the pipeline.
Jeffery said: “NHSmail is more than just an email service. The system manages the identities of all users within the Microsoft Active Directory in the NHS and allows local administrators to manage accounts within the NHSmail portal.
“Typically, NHS organisations will manage local identities within their own Active Directory and use the NHS Electronic Staff Record for workforce management, including the on-boarding and off-boarding of employees.
“With more than 13,000 health and care organisations in England and Scotland using NHSmail and 64,000 movements of user accounts every month, the burden is real and the security implications relating to identity are acute. But that also means the opportunity for improvement is significant.”
NCSC warns against using Windows 7
The National Cyber Security Centre (NCSC) has warned the public not to use Windows 7 to access internet banking or email applications after Microsoft pulled support for the operating system last week.
NCSC, the public-facing arm of the UK’s GCHQ intelligence service, urged people running the now-outdated Windows 7 to upgrade to Windows 10 in order to avoid possible cyber security attacks.
Microsoft officially ended support for Windows 7 on 14 January, meaning computers running the software will no longer receive security and other important updates.
An NCSC spokesman said: “The NCSC would encourage people to upgrade devices currently running Windows 7, allowing them to continue receiving software updates which help protect their devices.
“We would urge those using the software after the deadline to replace unsupported devices as soon as possible, to move sensitive data to a supported device, and not to use them for tasks like accessing bank and other sensitive accounts.
“They should also consider accessing email from a different device.”
Poll suggests half of people “wouldn’t know” the warning signs of a cyber security incident.
Almost half of respondents to a Twitter poll run by Infosecurity Europe admitted that they would be completely unaware if a cyber security breach occurred in their organisation.
In answer to the question: “If a cyber breach occurred, how quickly could you discover it?” 47.6% conceded they simply would not know.
The poll was designed to explore incident response, an area that has come under recent scrutiny following Travelex’s response to its New Year’s Eve cyber-attack, which left many of its systems down and impacted travel currency sales.
According to Maxine Holt, research director at Ovum, this reflects a widespread issue. “Discovering a breach well after the event is usual. Uncovering breaches is not easy, but proactive threat hunting is an approach being increasingly used by organisations.
“Regularly scanning environments to look for anomalies and unexpected activity is useful, but it can be difficult to deal with the number of resulting alerts. Ultimately, effective cyber hygiene involves having layers of security to prevent, detect and respond to incidents and breaches.”
The poll also examined risk insight, asking: “What understanding do you have of your information assets?” A worrying 44.7% revealed they had “very little” understanding, with 30.7% stating they had “some” – and only 24.7% said their grasp was “comprehensive”.
Bev Allen, CISO at Quilter, said: “Many companies don’t know what or where all their information assets are. They may think they do; but if they’re wrong this leaves them vulnerable to breaches. Consistent knowledge of your assets takes effort; you need tools and systems to record what you have, you need people to follow appropriate processes, and you need to search to find out what you don’t know about and where it is. This search must be done regularly.”
The post #nationalcybersecuritymonth | Cyber security news round-up appeared first on National Cyber Security.
View full post on National Cyber Security
Only in Silicon Valley does a longtime tech startup founder find a second career in a chocolate-making robot.
Nate Saal studied molecular biophysics and biochemistry at Yale University after graduating from Palo Alto High School in 1990. After returning to Palo Alto, he quickly shifted from science to the internet, founding what he says was the first web-based software updating service in 1996. He went on to start more technology companies and later worked for CNET and Cisco.
But these days, he’s immersed in chocolate — specifically, chocolate made by a countertop device that he created called CocoTerra. The sleek white device, which looks like a large, futuristic coffee maker, uses algorithms, hardware and a smartphone app to transform cocoa nibs, milk powder, cocoa powder and sugar into chocolate in about two hours.
Saal has high hopes for the machine, which has yet to be released. In the age of automation, where robots are making pizza and ramen and delivering our food, he sees CocoTerra as doing something different: using technology to deepen rather than disrupt people’s connection to how their food is made.
“We’re not trying to slap technology for technology’s sake on top of that to abstract it away, to take creativity away,” he said. “We’re trying to actually create a whole new category of people who can now make chocolate.”
While Saal’s professional career has focused on technology, he has always filled his weekends with homegrown food experiments, like keeping bees and growing grapes and olives to make wine and olive oil from scratch. He’s fascinated by the “deep science” of these activities.
Making chocolate, however, was not in his repertoire. It wasn’t until he took his brother-in-law, who works in the coffee business, to a chocolate tasting several years ago that a conversation about the similarities between the two industries got him thinking. His brother-in-law hypothesized that home coffee machines have allowed more people to understand and appreciate coffee in a way that chocolate hasn’t experienced. People did make chocolate at home, he found, but it was a lengthy process that required several expensive appliances.
“There’s a bread machine, an ice cream maker and a juicer and a pasta maker and a tea brewer and a coffee maker — every major food category has a home appliance. What I discovered very quickly was there is no such thing (for chocolate),” Saal said.
He educated himself by going to chocolate-making classes, including a boot camp at Madre Chocolate in Hawaii. Back in Palo Alto, he and a team got to work designing a device that could combine all steps in the chocolate-making process — grinding, refining, conching, tempering and molding — in one machine. It typically grinds the single-origin cocoa nibs for about half an hour, using stainless steel balls, then refines the cocoa butter, sugar and milk powder. Conching is the “slow manipulation or agitation of chocolate at elevated temperatures to help drive off some undesired flavors,” said Chief Operating Officer Karen Alter. Named for the conch shell-shaped equipment, this part of the process is often on display during chocolate factory tours, she said, with large vats that have paddles slowly moving liquid chocolate.
The next step, tempering, involves cooling the ingredients to a specific temperature that will create a specific structure of seed crystal in the cocoa butter molecules, Saal enthusiastically explained. The crystals solidify, creating shiny, hard chocolate. A patented centrifuge inside the machine cools and spins the chocolate to remove bubbles.
The final result is a ring-shaped, half-pound mold of chocolate, rather than the traditional rectangular bar.
On the back end, technology allows a level of customization that CocoTerra’s creators hope will make the device as appealing for experts as for novices. A cloud-based recipe system, accessible online or via an app, guides you from start to finish in a recipe. People can either default to CocoTerra’s recipes, such as 62% dark chocolate or milk chocolate with almonds, or customize them, from level of sweetness and creaminess, to added flavors and ingredients, to the tempering temperature. People can easily control for allergies or dietary restrictions.
CocoTerra will sell the base ingredients directly to customers, focusing on fair trade, ethically grown nibs, or people can use their own. Those who are advanced enough to roast and shell their own cacao beans could still do that, put them into the machine and then create their own recipes.
Producing quality chocolate in two hours is “jaw-dropping” to many in the chocolate industry, Saal said.
“I thought they were totally crazy when I first talked to them on the phone,” John Scharffenberger told CNBC. Scharffenberger, who co-founded Scharffen Berger in San Francisco in 1997 before small batch, artisan chocolate was a thing, is now a CocoTerra investor and calls it “a natural extension of the craft chocolate movement.”
The company won’t disclose a price for the machine, which they claim is the world’s first tabletop chocolate maker. CocoTerra has raised more than $2 million in investments and is now focused on a larger round to fund the release of the device.
“This is about the evolution of technology to make chocolate. But it’s also making it accessible,” Saal said. “We’re bringing that to people by using smart mechanical engineering and software to make it accessible so that you can actually now focus on things like the flavor and recipe and the look and the design and the craft of it.”
Last year I wrote about how the Sophos Security Team uses a variety of data streams to help give context to its threat hunting data.
One of those data streams is from our very own Sophos Central, but we have always used an unsupported method to obtain it, until now. The Sophos Security Team is super excited to let you know that the Sophos Central API has been officially released!
This means there’s now a supported method to get tenant information from Sophos Central, and it will help provide context to other security logs you may be monitoring in your estate.
We are also sharing our Sophos Central API Connector Python Library to help you get the information quickly using your Sophos Central API keys.
Let’s dig deeper into how the data is used and obtained.
About the API
There are several steps required to begin querying endpoint and event information from the Sophos Central API. You will need to create and securely store a client ID and client secret to access the API for your tenant(s). We can’t stress enough how important it is to store these keys securely.
Here’s the basic concept of the authorization process:
- Authorize and obtain a bearer token for OAuth2 using your client ID and client secret.
- Authenticate with the whoami API to get your partner, organization or tenant ID using the bearer token.
- If you are a partner or organization, you can obtain all your tenant ID information for your different estates using the specific API.
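That three-step flow can be sketched in Python using only the standard library. This is a minimal sketch, not the official connector: the token and whoami URLs below are the ones published on the Sophos Central developer site, but treat the exact field names as assumptions to verify there before relying on them.

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://id.sophos.com/api/v2/oauth2/token"
WHOAMI_URL = "https://api.central.sophos.com/whoami/v1"

def build_token_body(client_id, client_secret):
    # Form-encoded body for the OAuth2 client-credentials grant
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "token",
    })

def get_bearer_token(client_id, client_secret):
    # Step 1: exchange the client ID and secret for a bearer token
    req = urllib.request.Request(
        TOKEN_URL,
        data=build_token_body(client_id, client_secret).encode(),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def whoami(token):
    # Step 2: the whoami response identifies you as a partner,
    # organization or tenant, and carries the matching ID
    req = urllib.request.Request(
        WHOAMI_URL, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Partners and organizations would then use the ID from `whoami` with the relevant tenants API (step 3) to enumerate their estates.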
Once you have your tenant IDs and their associated data region API host, you can begin to get endpoint or event data for those tenants. In this article we’ll focus on two APIs: GET /alerts and GET /endpoints.
The Endpoint API focuses on querying computer and server endpoints. It allows you to perform routine actions on them such as gathering system information, performing or configuring a scan, gathering or changing the tamper-protection state, triggering an update, or deleting an endpoint. Using the GET /endpoints path returns all the endpoints for the specified tenant.
The Common API provides interactive management of open alerts and allows you to act on them. The GET /alerts functionality, which is part of the Common API, fetches alerts which match the criteria you have specified in the query parameters.
Once you have the allowed actions from the alert, you can post to perform an action for that event. Alternatively, there is a path to post a search for specific event criteria, or search for alerts for a specific endpoint ID.
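As a rough illustration, paging through GET /endpoints for a single tenant looks like the sketch below. The `X-Tenant-ID` header and the `pageFromKey`/`nextKey` paging fields follow the developer documentation, but verify them against the developer site; this is not the official client.

```python
import json
import urllib.parse
import urllib.request

def build_endpoint_url(data_region_host, page_from_key=None, page_size=None):
    # GET /endpoints lists every endpoint for the tenant; results are paged
    params = {}
    if page_from_key:
        params["pageFromKey"] = page_from_key
    if page_size:
        params["pageSize"] = page_size
    query = ("?" + urllib.parse.urlencode(params)) if params else ""
    return f"{data_region_host}/endpoint/v1/endpoints{query}"

def get_endpoints(token, tenant_id, data_region_host):
    """Yield every endpoint for one tenant, following the page key."""
    headers = {"Authorization": f"Bearer {token}", "X-Tenant-ID": tenant_id}
    page_key = None
    while True:
        url = build_endpoint_url(data_region_host, page_key)
        req = urllib.request.Request(url, headers=headers)
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        yield from body.get("items", [])
        page_key = body.get("pages", {}).get("nextKey")
        if not page_key:  # no further pages
            break
```

The same pattern applies to GET /alerts on the Common API, swapping the path and adding your query parameters.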
For information on how to create your API keys and more detailed information on the APIs themselves, have a look at the Sophos Central API developer site.
All of this is important to know, but how does the Sophos Security Team obtain and use this data?
What we use the data for
The information obtained from Sophos Central API, coupled with other security/applications logs in our SIEM, allows us to enrich our security use cases. This lets us pinpoint the more serious events and swiftly act on these.
It also aids automation, allowing flows to act on events and obtain more information from Central on a given device. This offers greater insight into the health state of the machine. Not only that – given the alert type, you can clean or delete detections, trigger a new scan, or see which systems you need to focus on in an incident.
We plan to offer even more data and functionality over the coming months. I would encourage you to keep an eye on our What’s New page for further announcements.
Sophos Security Team Central API Connector Library
Our goal when developing the API Connector Library was to make it easy for our team to utilize the Sophos Central API in our various security use cases.
We then realized the library would also be useful for you, our customers, to help you begin ingesting data into your SIEM, or simply obtaining the data so you could see what you could do with it.
So that’s exactly what we have done! The library is now available. You can access it from:
- PyPI – pip install sophos-central-api-connector
Alongside the library, we have a sophos_central_main.py which has been written to get the inventory or alert data from Sophos Central API using the CLI.
There are four output options available using the CLI:
- stdout: Print the inventory information to the console.
- json: Save the output of the request to a json file.
- splunk: This will send the data to Splunk with no changes made and apply the settings from the token configuration.
- splunk_trans: Using this output will apply the information set in the splunk_config.ini for the host, source, and sourcetype. This will override the settings in the token configuration. However, it will not change the Index that the data should be sent to.
I will cover the functionality with an example command, but first we need to cover the different config files it uses.
The majority of the variables contained in this config file (sophos_central_api_config.py) must remain static to maintain the correct functionality of the Sophos Central API Connector. However, there are two variables which can be changed if you’d prefer the default behavior to be different.
DEFAULT_OUTPUT: This variable is set to ‘stdout’ so if no output argument is passed to the CLI, results will be returned to the console. You can change this to be another valid value if desired.
DEFAULT_DAYS: This variable is set to ‘1’ if no days argument is passed in certain scenarios. This default is also used for the default number of days passed for polling alert events. More on this to follow below.
While you can set static API credentials in this configuration, we strongly advise that this is only done for testing purposes. Where possible, use AWS Secrets Manager to store your credential ID and token.
You can access your AWS Secrets by configuring your details as below:
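The original example appears to be missing here, but the shape of such a configuration is roughly as follows. Note that the section and key names below are illustrative, not necessarily the connector’s actual ones; check the library’s README for the real format.

```ini
[aws]
; Name of the secret holding your Sophos Central client ID and secret
secret_name = sophos_central_api_credentials
; AWS region where the secret is stored
region_name = eu-west-1
```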
The page size configuration is the number of events you would like to appear per page when querying the Sophos Central API. You may specify maximum page sizes, which will be checked during the execution of the connector. If these page sizes are left blank, the default page sizes will be used as determined by the API.
This config is solely for admins who are sending the alerts and inventory directly to Splunk. There are options for both static token information as well as an option to use the AWS Secrets Manager. We would recommend that the static entry option is only used for testing purposes and the token is stored and accessed securely.
Information on how to enable and set up the Splunk HTTP Event Collector can be found in the HTTP Event Collector documentation.
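For readers wiring this up by hand, sending a single alert to HEC reduces to a JSON POST. The sketch below follows Splunk’s documented HEC event endpoint and `Authorization: Splunk <token>` header; the URL and token are placeholders, and this is an illustration rather than what the connector library itself does.

```python
import json
import urllib.request

def build_hec_payload(alert, host=None, source=None, sourcetype=None):
    # splunk_trans-style behaviour: explicit metadata overrides the
    # defaults configured on the HEC token (the index is left alone)
    payload = {"event": alert}
    if host:
        payload["host"] = host
    if source:
        payload["source"] = source
    if sourcetype:
        payload["sourcetype"] = sourcetype
    return payload

def send_to_hec(hec_url, hec_token, alert, **meta):
    """POST one event to Splunk's HTTP Event Collector."""
    req = urllib.request.Request(
        hec_url + "/services/collector/event",
        data=json.dumps(build_hec_payload(alert, **meta)).encode(),
        headers={
            "Authorization": f"Splunk {hec_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```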
Once you have set up your config files, you can start see what data you have.
To display syntax help information:
‘python <path to file>/sophos_central_main.py --help’
To get your tenant information:
‘python <path to file>/sophos_central_main.py --auth <auth_option> --get tenants’
To get inventory data:
‘python <path to file>/sophos_central_main.py --auth <auth_option> --get inventory --output <output_option>’
If you wish to just get the inventory for one specific tenant, then the syntax is the following:
‘python <path to file>/sophos_central_main.py --auth <auth_option> --get inventory --tenant <tenant_id> --output <output_option>’
You can use the tenant ID displayed when the get tenant query was run.
As with the option for “get inventory”, you can retrieve alerts for a specific tenant or all tenants. In addition, you can specify the number of days’ worth of alerts you would like to pull back by using the days parameter.
Sophos Central holds event data for 90 days, so when passing the days argument you can specify an integer from 1 to 90. If no argument for the number of days is passed, a default of one day is used, or whatever was set in ‘default_days’ in the sophos_central_api_config.py file.
To get the alert data run:
‘python <path to file>/sophos_central_main.py --auth <auth_option> --get alerts --days <integer: 1-90> --output <output_option>’
Because alerts could come into Central at varying times depending on when the machine sends the information back, we needed a way to see what alerts had already been sent to our SIEM. When passing the polling option, a list of successful events will be maintained to prevent duplicates from being sent to the SIEM.
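The dedup idea behind the polling option can be sketched by persisting the IDs of alerts that have already been forwarded. This is our illustration of the concept, not the library’s actual implementation; the state-file name and the alert `id` field are assumptions.

```python
import json
from pathlib import Path

STATE_FILE = Path("alerts_sent.json")  # illustrative name only

def load_sent_ids(path=STATE_FILE):
    # IDs of alerts already forwarded to the SIEM on previous runs
    if path.exists():
        return set(json.loads(path.read_text()))
    return set()

def dedupe_alerts(alerts, sent_ids):
    """Keep only alerts not yet forwarded, and record the new ones."""
    fresh = [a for a in alerts if a["id"] not in sent_ids]
    sent_ids.update(a["id"] for a in fresh)
    return fresh

def save_sent_ids(sent_ids, path=STATE_FILE):
    # Persist the updated list so the next poll skips these alerts
    path.write_text(json.dumps(sorted(sent_ids)))
```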
To run the polling option:
‘python <path to file>/sophos_central_main.py --auth <auth_option> --get alerts --days <integer: 1-90> --poll_alerts --output <splunk or splunk_trans>’
There is no polling option for the “get inventory” functionality: a full inventory requires the data for all systems to be returned, and that data can change each time the CLI is run. Alternatively, you can get inventory data for a specific endpoint ID if required.
Why the Sophos Security Team is excited about Sophos Central API
We love the flexibility Sophos Central API offers, and how it allows us to bring more context to our other logs. We’ve been able to instantly get an idea of the host health and whether there have been any recent detections. Plus, alerts and devices are really easy to maintain from Central.
It’s safe to say that the Security Team has given the API a big thumbs up already, and we hope that you find the Sophos Central API Connector Python Library useful too.
Keep an eye out for more features in the future as Sophos Central API continues to be updated.
Technology is at the core of whatever Amazon does — from algorithms that forecast demand and place orders from brands, and robots that sort and pack items in warehouses to drones that will soon drop packages off at homes.
At its new Go Stores, for instance, advances in computer vision have made it possible to identify the people walking in and what products they pick up, helping add them to their online shopping carts.
Jeff Bezos, the founder of Amazon and the world’s richest man, is always pulling new rabbits out of his hat, like next-day or same-day shipping and cashier-less stores. Besides, there is Blue Origin, the aerospace company privately owned by Bezos, which is on a mission to make spaceflight possible for everyone.
Be that as it may, a lot more disruption aimed at reaching the common man is on the anvil.
The most far-reaching and impactful technologies being developed today are for Amazon’s own use, but some others have the potential to disrupt every sector.
The technology marvels that Amazon Web Services — the largest profit driving unit in Bezos’ stable — is working on could jolt several industries, including in India, in the same way that Amazon once disrupted retail.
“In retail, while things like the size of the catalogue, advertising and other stuff might play a role in success, at Amazon, I think success is largely technology driven,” said Chief Technology Officer Werner Vogels.
The ecommerce giant is using advances in technology to disrupt several sectors outside of retail though — medicine, banking, logistics, robotics, agriculture and much more. Interestingly, some of that work is happening in India.
Initially, the thinking was around allowing enterprises in these sectors to grow by using its cloud storage and computing capabilities.
Now, Amazon’s reach has become more nuanced and it has moved up the value chain.
For example, Amazon is no longer just offering banks a place to securely store information; it is going further by offering tools to detect fraud, making it unnecessary for lenders to build expensive in-house data science teams.
It is a similar story in other industries, made possible due to the massive amounts of data that Amazon collects and processes.
“We give people the software capability, so they no longer need to worry about that side of things. Most of our services are machine learning under the covers (and) that’s possible mostly because there’s so much data available for us to do that,” Vogels said.
Hospitals in the United States have to save imaging reports for years. Earlier these were stored on tapes, since doing so digitally cost millions of dollars.
The advent of cheaper cloud storage meant new scans could be saved digitally, making them accessible to doctors on demand.
Now, doctors could refer to a patient’s earlier CT scan and compare that with the new one to diagnose an ailment, said Shez Partovi, worldwide lead for healthcare, life sciences, genomics, medical devices and agri-tech at Amazon.
The power of cloud and AWS’ own capabilities in medical technology have only expanded since.
Healthcare and life sciences form rapidly scaling units of AWS, which is building a suite of tools that allow breakthroughs in medicine — from hospitals using the tools to do process modelling or operational forecasting, refining the selection of candidate drugs for trial or delivering diagnoses through computer imaging.
Developed markets will be the first to adopt such technologies, but AWS is seeing demand surge from the developing world, including India.
“Not everyone is within a mile of a radiologist or physician, so diagnostics through AI could solve for that. Further, there’s a lack of highly trained people, but when all you have to do is take an image, it requires a lot less training,” said Partovi.
Bezos, in his private capacity, is now looking to connect remote regions with high-speed broadband. He is building a network of over 3,000 satellites through “Project Kuiper”, which will compete with Elon Musk’s SpaceX and Airbus-backed OneWeb.
The bigger bet is in outer space, though. His rocket company Blue Origin has already flown commercial payloads on New Shepard, the reusable rocket that competes with SpaceX’s Falcon 9. The capsule atop New Shepard can carry six passengers, which Bezos looks to capitalise on for space tourism, a commercial opportunity most private space agencies are looking at.
It is also building a larger reusable rocket – New Glenn, named after John Glenn, the first American to orbit the earth – which can carry payloads of as much as 45 tonnes to low earth orbit.
Bezos’ aim, however, is to land on the Moon. His Blue Moon lander can deliver large infrastructure payloads with high accuracy to pre-position systems for future missions. The larger variant of Blue Moon has been designed to land a vehicle that will allow the United States to return to the Moon by 2024.
Amazon’s take on robotics is ground-up.
The company has been part of an open-source network that is developing ROS 2 or Robot Operating System 2, which will be commercial-grade, secure, hardened and peer-reviewed in order to make it easier for developers to build robots.
“There is an incredible amount of promise and potential in robotics, but if you look at what a robot developer has to do to get things up and running, it’s an incredible amount of work,” said Roger Barga, general manager, AWS Robotics and Autonomous Services, at Amazon Web Services.
Apart from building the software that robots will run on, AWS is also making tools that will help developers simulate robots virtually before deploying them on the ground, gather data to run analytics on the cloud and even manage a fleet of robots.
While AWS will largely build tools for developers, as capabilities such as autonomous navigation become commonplace, the company could look to build them in-house and offer them as a service to robot developers, Barga said.
With the advent of 5G technology, more of the processing capabilities of robots will be offloaded to the cloud, making them smarter and giving them real-time analytics capabilities to do a better job. For India, robot builders will be able to get into the business far more easily, with all the tools accessible on the cloud, overcoming the barrier of a lack of fundamental research in robotics.
AWS might be a behemoth in the cloud computing space, but cloud still makes up just 3% of all IT in the world. The rest remains on-premise. While a lot will migrate to the cloud, some will not. In order to get into the action in the on-premise market, Amazon has innovated on services that run on a customer’s data centre, offering capabilities as if the data is stored on the cloud.
With Outposts, which was announced last month, AWS infrastructure, AWS services, APIs, and tools will be able to run on a customer’s data centre.
Essentially, this will allow enterprises to run services on data housed within their own data centres, just like how they would if it had been stored on AWS.
The other big problem that AWS is looking to solve is not having its own data centres close enough to customers who require extremely low-latency computing. For this, the company has introduced a new service called Local Zones, where it deploys its own hardware closer to a large population, industry or IT centre where no AWS Region exists today.
Both these new services from AWS could be valuable in India given the lower reach of cloud computing among enterprises as well as stricter data localisation requirements.
Artificial Intelligence/Machine learning
Amazon is moving up the value chain in offering services backed by artificial intelligence and machine learning to automate repetitive tasks done by human beings.
Enterprise customers will simply be able to buy into these services with minimal customisation and without a large data science and artificial intelligence team.
In December, AWS launched its Fraud Detector service that makes it easy to identify potentially fraudulent activity online, such as payment fraud and creation of fake accounts. Even large banks in India have struggled to put together teams to build machine learning models for fraud detection, but with such a service they can train their systems easily.
CodeGuru is another service that uses machine learning to do code reviews and give application performance recommendations, offering specific suggestions to fix code. Today, this is largely done manually, with several non-technology companies struggling to build great software for themselves due to bad code.
Transcribe Medical is a service that uses Amazon’s voice technology to create accurate transcriptions from medical consultations between patients and physicians. Medical transcription as a service is a big industry in India, and India’s IT service giants hire thousands to review code. These services are expected to replace mundane manual tasks, freeing up resources for sophisticated tasks, and could lead to disruption.