How Thomson Reuters is leveraging AI to enhance productivity
Thomson Reuters is a venerable news and information organization, with historical roots stretching all the way back to the 19th century. Thomson and Reuters merged in 2008, and the combined company provides a mix of news and specialized information in areas like law, trade and accounting.
The organization processes a ton of information every year, relying on a staff of 27,000 subject experts and journalists around the world to generate a variety of content. As generative AI has emerged in recent months, it would surely be tempting to use it in the newsroom, as other news organizations have done, and see this capability as an opportunity to reduce staff, cut costs and automate, automate, automate.
While the company sees the benefits of AI for both its employees and customers, it is not in the worker replacement camp, at least not yet. Instead, it sees AI as a way to help customers find information faster, and help its employees operate more efficiently, removing the mundane parts of the job so people can do what they do best.
It would be easy to think that an organization as old as Thomson Reuters would simply dismiss technology like generative AI, but the company tells TechCrunch+ that it is all in when it comes to the latest technology, as it looks for ways to improve and modernize its operations.
The people part
Chief people officer Mary Alice Vuicic says Thomson Reuters sees automation as only part of the story, and if you concentrate on that alone, you may miss some of AI’s biggest benefits.
“We think AI is a phenomenal opportunity for the professionals we serve through our products, and equally internally for our colleagues,” Vuicic told TechCrunch+. “We think it’s a tool for augmenting the potential of our colleagues in new ways, helping them do work better, faster, more effectively.”
That said, she also recognizes that large language models (LLMs) do not always provide perfect answers, and Thomson Reuters is already relying on internal expertise to help correct the models.