Ofcom, the U.K. internet regulator, has published another draft guidance as it continues to implement the Online Safety Act (OSA). The latest set of recommendations is aimed at supporting in-scope companies in meeting legal obligations to protect women and girls from online threats such as harassment and bullying, misogyny, and intimate image abuse.
The government has said that protecting women and girls is a priority for its implementation of the OSA. Certain forms of (mostly) misogynistic abuse, such as sharing intimate images without consent or using AI tools to create deepfake porn targeting individuals, are explicitly set out in the law as enforcement priorities.
The Online Safety Act, which was passed by the U.K. parliament back in September 2023, has been criticized for not doing enough to force platform giants to reform, even though it contains major penalties for non-compliance: fines of up to 10% of global annual turnover.
Child safety campaigners have also expressed frustration at how long the law is taking to implement, as well as doubts over whether it will have the desired effect.
In an interview with the BBC in January, even the technology secretary, Peter Kyle, who inherited the legislation from the previous government, described it as "very uneven" and "unsatisfactory." But the government is sticking with the approach. Part of the OSA's slow rollout can be put down to the lengthy implementation timetable ministers allowed for, which requires Parliament to approve Ofcom's compliance guidance.
However, enforcement is expected to kick in soon for core requirements around tackling illegal content and protecting children. Other aspects of OSA compliance will take longer to bed in, and Ofcom concedes that these latest recommendations will not become fully enforceable until 2027 or later.
Approaching enforcement
"The first duties of the Online Safety Act come into force next month," Ofcom's Smith told us. "So we will be enforcing some of the core online safety duties ahead of this guidance itself becoming enforceable."
The new draft guidance, which is focused on keeping women and girls safe online, is intended to complement Ofcom's broader guidelines on illegal content, which also include recommendations aimed at protecting minors from seeing adult content online.
In December, the regulator published its final guidance on how in-scope platforms and services should reduce risks related to illegal content, an area where child protection is a clear priority.
It has also previously produced a children's safety code, which recommends that online services deploy age checks and content filtering to ensure kids are not exposed to inappropriate content such as pornography. And as part of its work to implement the online safety regime, it has developed recommendations on age assurance technologies for adult content sites, with the aim of pushing pornographic sites to take effective steps to prevent minors from accessing inappropriate content.
The latest set of guidance was developed with the help of victims, survivors, women's advocacy groups, and safety experts, per Ofcom. It covers four main areas where the regulator says females are disproportionately affected by online harm, namely: online misogyny; pile-ons and online harassment; online domestic abuse; and intimate image abuse.
Safety by design
Ofcom's top-line recommendation urges in-scope services and platforms to take a "safety by design" approach. Smith told us the regulator wants to encourage tech firms to "take a step back" and "think about the user experience in the round." While she acknowledged that some services have put in place useful measures for reducing risks in this area, she said there is still a lack of holistic thinking when it comes to prioritizing the safety of women and girls.
"What we're really asking for is just a step change in how design processes work," she told us, saying the aim is to ensure that safety considerations are baked into product design.
She pointed to the rise of generative AI image services, which she noted have driven "tremendous" growth in deepfake intimate image abuse, as an example of where technologists could have taken pre-emptive steps to curb the risk of their tools being weaponized to target women and girls, yet did not.
"We think there are reasonable things that services can do at the design stage that would help address the risk of some of these harms," she suggested.
Examples of "good" industry practice Ofcom highlights in the guidance include:
- removing geolocation by default (to reduce privacy and stalking risks);
- carrying out "abusability" testing to identify how a service could be weaponized or misused;
- taking steps to strengthen account security;
- designing in user prompts intended to make posters think twice before publishing abusive content (a minimal illustration follows this list);
- offering easily accessible reporting tools that let users flag problems.
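To make the "think twice" prompt idea concrete, here is a minimal sketch of how such a pre-publication check could be wired up. It is illustrative only, not something Ofcom prescribes: the looks_abusive function is a hypothetical stand-in for whatever abuse classifier a platform actually runs, reduced here to a trivial keyword check so the example is self-contained.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class DraftPost:
    author: str
    text: str


def looks_abusive(text: str) -> bool:
    """Hypothetical stand-in for a real abuse/toxicity classifier."""
    flagged_terms = {"kill yourself", "worthless"}
    lowered = text.lower()
    return any(term in lowered for term in flagged_terms)


def submit_post(post: DraftPost, confirm: Callable[[str], str] = input) -> bool:
    """Publish a draft post, inserting a 'think twice' prompt when it is flagged.

    Returns True if the post was published, False if the user withdrew it.
    """
    if looks_abusive(post.text):
        answer = confirm(
            "This post may be hurtful. Are you sure you want to publish it? (y/n) "
        )
        if answer.strip().lower() != "y":
            return False  # user chose not to publish after the prompt
    # ...hand off to the real publishing pipeline here...
    return True
```

The design point is simply to add a moment of friction at the riskiest step rather than to block content outright; the user still decides whether to publish.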
As with all of Ofcom's OSA guidance, not every measure will be relevant for every type or size of service, since the law applies to online services large and small, spanning everything from social media to online dating, gaming, forums, and messaging apps, to name a few. So a big part of the job for in-scope companies is working out what compliance means in the context of their own products.
Asked whether Ofcom has identified any services that currently meet the standards of the guidance, Smith suggested they have not. "There's still a lot of work to be done across the industry," she said.
She also implicitly acknowledged that the challenge may be growing, given the steps back from trust and safety taken by some major industry players. For example, since buying Twitter and rebranding the social network as X, Elon Musk has slashed its trust and safety headcount in favor of what he has billed as a maximalist approach to free speech.
In recent months, Meta, which owns Facebook and Instagram, has appeared to take some similar steps, saying it is ending third-party fact-checking contracts in favor of rolling out an X-style "community notes" system.
Transparency
Smith suggested that Ofcom's response to such high-level shifts, where operators' conduct risks fueling rather than curbing online harm, will center on using the transparency and information-gathering powers it has under the OSA to shine a light on impacts and drive user awareness.
So, in short, the tactic here looks set to be "name and shame," at least in the first instance.
"Once we've put the finishing touches on the guidance, we will produce a (market) report ... about who is using the guidance, who is following what steps, and what kind of outcomes they are achieving for their users who are women and girls, and really shine a light on that," she said.
Smith suggested that companies wanting to avoid the risk of being publicly called out for poor performance on women's safety will be able to turn to Ofcom's guidance for "practical steps" on how to improve things for their users and address the risk of these harms, too.
"Platforms operating in the U.K. will have to comply with U.K. law," she noted. "That means complying with the illegal harms duties and the child protection duties under the Online Safety Act."
"I think that's also where our transparency powers come in: if the industry changes direction and harms increase, that's where we will be able to highlight the relevant information to U.K. users, to the media, and to parliamentarians," she added.
Deepfake technology
One type of online harm where Ofcom is explicitly beefing up its recommendations, even before it has actively begun enforcing the OSA, is intimate image abuse: the latest draft guidance recommends the use of hash matching to detect and remove this abusive imagery.
"We've included additional steps in this guidance that go beyond what we've already set out in our codes," Smith noted, stressing that Ofcom plans to update its earlier codes to incorporate the change "in the near future."
She added: "So that's a way of saying to platforms that you can get ahead of those enforceable requirements by following the steps set out in this guidance."
Ofcom's recommendation to use hash matching technology to counter intimate image abuse is a response to a significant rise in this risk, per Smith, particularly when it comes to abuse involving AI-generated deepfake imagery.
She noted that "there was more intimate image abuse reported in 2023 than in all previous years combined," adding that Ofcom has also gathered more evidence on the effectiveness of hash matching in addressing this harm.
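For readers unfamiliar with the technique: hash matching works by comparing a fingerprint (a "hash") of each uploaded image against a database of hashes of known abusive images; for non-consensual intimate imagery, such hashes can be contributed via victim-led services like StopNCII. The sketch below is a minimal illustration rather than anything Ofcom prescribes: it uses an exact cryptographic hash for simplicity, whereas production systems typically use perceptual hashes (such as PhotoDNA or PDQ) so that resized or re-encoded copies still match, and the hash set here is a hypothetical placeholder.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abusive images. In practice this
# would be populated from a trusted source (e.g. victim-submitted hashes),
# not hard-coded.
KNOWN_ABUSE_HASHES: set[str] = set()


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of an uploaded file, streamed in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_block_upload(path: Path) -> bool:
    """Return True if the uploaded file matches a known abusive image hash."""
    return sha256_of_file(path) in KNOWN_ABUSE_HASHES
```

The key property is that matching happens against fingerprints rather than the images themselves, so neither the reporting service nor the platform needs to pass around the original abusive content.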
The draft guidance as a whole will now be put out for consultation, with Ofcom inviting feedback until May 23, 2025, after which it expects to produce final guidance by the end of this year.
Eighteen months after that, Ofcom will produce its first report reviewing the industry's performance in this area.
"So we're getting into 2027 before we produce our first report on who is doing what (to protect women and girls online), but there's nothing to stop platforms acting now," she added.
Responding to criticism that the OSA is taking Ofcom too long to implement, she said it was right for the regulator to consult on its compliance measures. But with the first enforceable duties kicking in next month, she suggested Ofcom expects the conversation around the issue to shift, too.
"That will start to change the conversation with platforms in particular," she predicted, adding that the regulator will also then be in a position to start showing progress in moving the needle when it comes to reducing online harm.