#hacking | XSS vulnerability in CKEditor prompts need for Drupal update



John Leyden

20 March 2020 at 14:20 UTC

Updated: 20 March 2020 at 14:29 UTC

Text editor flaw spawns CVE

A vulnerability in a third-party library component has had a knock-on effect on software packages that rely on it, including the Drupal content management system.

The issue involves a cross-site scripting (XSS) bug in CKEditor, a rich text editor that comes bundled with various online applications.

An attacker might be able to exploit the XSS vulnerability to target users with access to CKEditor. This potentially includes site admins with privileged access.

Exploitation is far from straightforward and would involve tricking potential victims into copying maliciously crafted HTML code before pasting it into CKEditor in ‘WYSIWYG’ mode.

“Although this is an unlikely scenario, we recommend upgrading to the latest editor version,” the CKEditor developers explain in an advisory issued earlier this month.

CKEditor 4.14 fixes this XSS vulnerability in the HTML data processor, discovered by Michał Bentkowski of Securitum, as well as offering feature improvements and a fix for an unrelated XSS vulnerability in the third-party WebSpellChecker Dialog plugin.

An advisory from Drupal, issued on Wednesday, instructs users to update to a version of the CMS that features the patched version of CKEditor in order to mitigate the vulnerability.

In practice, this means upgrading to either Drupal 8.8.4 or Drupal 8.7.12.
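
For sites managed with Composer, applying the fix is typically a short sequence of commands. The sketch below is illustrative only, assuming a Composer-based Drupal 8 site that requires the drupal/core package and has Drush installed; projects built from the drupal/core-recommended template should substitute that package name.

  # Pull in the patched core release (8.8.4 on the 8.8 branch, 8.7.12 on 8.7)
  composer update drupal/core --with-dependencies

  # Apply any pending database updates and rebuild caches
  drush updatedb
  drush cache:rebuild

As with any core update, it is worth running this against a staging copy of the site before deploying to production.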

The security flaw is described as “moderately critical” by Drupal, even though attackers would need to be able to create or edit content in order to attempt exploitation.





#deepweb | Why we need more women to build real-world AI products, explained by science


Before we dive into why more women should lead AI teams, I want to share a fascinating story I heard from Tania Biland, a third-year student at Lucerne University of Applied Sciences and Arts.

The story as narrated by Tania:

Last semester, our class got split into three different groups in order to develop a safety technology solution for Swiss or German brands:

Group 1: Only women (my group)

Group 2: Only men

Group 3: Four women and one man

After four weeks, each team had to present its work.

Group 1, composed only of women, developed a safety solution for women in the dark. As the jury was all male, we decided to tell a story using a persona, music, and videos in order to make them feel what women experience on a daily basis. We also emphasized that everyone has a mother, sister, or wife in their life and that they probably don’t want her to suffer. In the end, our solution was rather simple technologically, using light to provide safety, but it connected with the audience emotionally.

Group 2, composed only of men, presented a more high-tech solution using AI, GPS, and video conferencing. They based their arguments on facts and numbers and pointed out their competitive advantages.

In Group 3, with four women and one man, the outcome didn’t seem finished. The only man in the group could not accept being led by women, so the team spent too much time discussing group dynamics instead of working.

The groups not only produced different outputs but also approached the problem differently. My group (Group 1) started by defining each other’s work preferences and styles in order to distribute responsibilities and keep the hierarchy as flat as possible.

On the other hand, the two other groups elected a leader for the team. It turned out that these “leaders” were perceived more as dictators, which led to heavy conflicts: those teams spent hours discussing and arguing while our group simply worked productively.

What science tells us about gender differences

The science of gender differences and their effects on behavior is still evolving and has not yet produced a clear set of explanations. Compiling most of the research, two main factors emerge that influence behavior:

  1. Potential physiological differences between men and women
  2. Social norms and pressures forming different behaviors

In the above story, as told by Tania, the women developed their solution with a collaborative leadership style (adhocracy culture), rotating the leading position based on the task and keeping an almost flat hierarchy. They built their argument by involving all stakeholders (in this case the mothers and wives, i.e. the users) and showing empathy for their problems. They saw the bigger picture and built a simpler solution that was actually finished.

Through the story, I was able to connect the dots on why most AI projects never end up moving out from the prototype phase to a real-world application.

Why are AI products not adopted?

Based on my experience, there are three main reasons why most AI and machine learning (ML) solutions do not move from the prototyping phase to the real world:

  1. Lack of trust: One of the biggest difficulties for AI or ML products is a lack of trust. Millions of dollars have been spent on prototypes, with very little success in real-world launches. Trust is one of the most fundamental values in doing business and providing value to customers, and artificial intelligence is the most heavily debated technology when it comes to ethical concerns and related trust issues. Trust comes from involving different opinions and parties throughout development, which rarely happens in the prototype phase.
  2. The complexity of a launch: Building a prototype is easy, but dozens of external parties need to be considered when moving into the real world. Besides the technical challenges, other disciplines (such as marketing, design, and sales) need to be integrated into the process.
  3. AI products often do not take all stakeholders into account: I have heard reports of Alexa and Google Home being used by men against their spouses in instances of domestic violence, for example to turn the music up very loud or to lock them out of their homes. It is possible that in an environment where mostly male engineers build these products, no one thinks about these kinds of scenarios. Additionally, there are many instances of artificial intelligence and datasets being biased, sexist, and racist [1].

Interestingly, none of the three points relate to the technical challenges, and all of them can be overcome by creating the right team.

How can AI be adopted more successfully?

In order to solve the above challenges and build more successful AI products, we need to focus on a more collaborative and community-driven approach.

This takes into account opinions from different stakeholders, especially those who are under-represented. Below are steps to achieve that:

Step 1. Involve different groups, especially women from the middle of the talent pyramid

In technology, most companies focus on hiring people at the top of the talent pyramid, where, largely for historical reasons, there are fewer women. For example, most computer science classes are less than 10 percent women. However, many talented women sit in the middle of the pyramid, educating themselves through online courses but lacking opportunities and encouragement.