Amid the various criticisms of Facebook's ad tools, and its lack of transparency over data usage, UK-based consumer advice personality Martin Lewis last year filed a lawsuit against The Social Network over the abundance of scam ads users were being shown in their feeds.
As per Lewis' statement at the time (April last year):
"[Facebook] is facilitating scams on a constant basis, and in a morally repugnant way. If Mark Zuckerberg wants to be the champion of moral causes, then he needs to stop his company doing this.”
Earlier this year, however, Lewis dropped his case, after reaching an agreement with Facebook on the implementation of two dedicated measures:
- Facebook would implement a new scam ads reporting tool in the UK
- Facebook would donate £3 million (US$3.7m) to set up the Citizens Advice Scams Action (CASA) service in order to provide additional support and guidance on scams to UK users
This week, Facebook has come through on both of these pledges, implementing a new anti-scam addition in the UK - though it's not significantly different from what's currently available elsewhere.
The new, on-platform ad reporting option provides an additional way for users to submit more specific feedback on any scam ads they see in their feeds.

As you can see here (though the image is small), when you tap on the three dots in the top right of an ad, you're presented with a range of action options. If you choose 'Report Ad', you can select 'Misleading or Scam', which then enables you to submit a report for Facebook to review.
At this point, there's no difference from the current reporting flow, which is available to all users for scam ads - as you can see here:

The new, UK-specific addition is in the last frame - users will now have a fourth reporting option: 'Send a detailed scam report'.
When selected, users will be able to manually enter additional, specific details on the scam in question, and that report will be assessed by "a new, dedicated, specially trained internal operations team who will handle these reports, and review and take down violating ads".
"That team will also investigate trends to help enforcement, and drive improvements. The tool and dedicated team are unique to the UK, as a result of the lawsuit."
So it's not so much a whole new reporting option as a commitment from Facebook to a team dedicated specifically to such issues, while users will have more capacity to share their concerns via the new reporting tool.
That, along with the donation to CASA, will ideally help lessen the impact of scams on the platform. According to TechCrunch, CASA will provide "one-on-one help to those worried they’re being scammed or who have already lost money as a result of fake ads". CASA will also implement new education and awareness programs to help UK users avoid scams.
It's a comparatively small functional addition - and there's no word on how much additional resourcing, if any, Facebook is putting into dealing with these more detailed scam complaints. But given Facebook's widespread usage, and its popularity among more vulnerable people in particular, it's good to see the company agreeing to take extra steps to address such concerns, and deal with potential issues.
This comes in addition to Facebook's recent launch of new tools to help people understand why they're being shown certain ads on the platform, including more specific insights on the ad targeting measures being used, and whether their info was gleaned from third-party data providers.

The increased transparency will help Facebook fend off concerns about its practices, providing consumers with more tools to understand how their data is being used for ad targeting. Whether they actually use such tools is another question, but one that's no longer on Facebook - which could help alleviate potential regulatory pressure on the company.
That's really the key question, and where this new UK initiative looks promising - by not only adding new options on-platform, but also empowering third-party groups to help educate people about such scams, Facebook may help to improve outcomes for vulnerable users. There's only so much Facebook can, and will, do to prompt users on its platform, but if third-party providers have more resources and capacity to educate, that may deliver better results.