Wikipedia introduces AI learning to identify bad edits

Wikipedia has announced it will roll out an AI engine to help identify badly edited content and vandals who maliciously edit articles, in an effort to police the openly editable encyclopedia. The machine-learning tool, called the Objective Revision Evaluation Service (ORES), has been built by the community's researchers and specialists and will go some way towards automating the review process, helping to keep content fair and trustworthy. It will also allow more articles to be vetted, then deleted or updated if they are found to be malicious or incorrect.

ORES exposes its models through APIs and trains them against edit- and article-quality assessments made by Wikipedia editors, commonly known as Wikipedians. It then produces a score for each individual edit and for articles as a whole. The APIs test using various different…
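To illustrate the API-based scoring described above, here is a minimal sketch of how a client might ask an ORES-style service to score specific revisions with a trained model. The endpoint path, model name, and parameter names below are assumptions based on ORES's public REST interface; check the service's own documentation before relying on them.

```python
def ores_scores_url(wiki, model, rev_ids,
                    base="https://ores.wikimedia.org/v3/scores"):
    """Build a request URL asking an ORES-style service to score the
    given revision IDs with one of its trained models (for example a
    'damaging' model that estimates whether an edit is vandalism).

    NOTE: the URL structure here is an assumption for illustration only.
    """
    # Multiple revision IDs are joined with '|' per common MediaWiki
    # API conventions (another assumption worth verifying).
    revids = "|".join(str(r) for r in rev_ids)
    return f"{base}/{wiki}/?models={model}&revids={revids}"


# Hypothetical usage: score two English-Wikipedia revisions for damage.
url = ores_scores_url("enwiki", "damaging", [123456, 789012])
print(url)
```

A real client would then fetch this URL and read back per-revision probability scores, which reviewers (or bots) could threshold to flag likely-bad edits for human attention.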

Link to Full Article: Wikipedia introduces AI learning to identify bad edits
