Earlier this month, Facebook denied accusations that it suppressed conservative news stories in its Trending Topics section, after a former contractor alleged the practice was common. But despite maintaining its innocence for a second time, the social network is now making changes to the section following an inquiry opened by the Senate Commerce Committee.

In a public press release and in a letter to Committee Chairman Sen. John Thune, Facebook General Counsel Colin Stretch wrote: "Our investigation has revealed no evidence of systematic political bias in the selection or prominence of stories included in the Trending Topics feature."

Stretch added that the company's analyses had shown the approval rates for conservative and liberal items were virtually identical in Trending Topics. He also wrote, however, that Facebook's investigation "could not exclude the possibility of isolated improper actions or unintentional bias in the implementation of our guidelines or policies."

To improve the system and "minimize risk where human judgment is involved," Facebook is updating the terminology in its guidelines, retraining its reviewers to emphasize that content decisions may not be made on the basis of politics or ideology, and putting additional controls and oversight in place around the review team.

As for the system itself, Stretch said Facebook will "no longer rely on lists of external websites and news outlets to identify, validate or assess the importance of particular topics." It will remove its so-called 'Media 1k' list of RSS feeds used to supplement the algorithm that generates potential trending topics.

Facebook is also scrapping its top-10 list of news outlets, which reviewers used to assign "importance levels" to Trending Topic items.

Senator Thune appeared satisfied with Facebook's response. In a statement of his own, he praised the company for addressing the allegations:

"Facebook's description of the methodology it uses for determining the trending content it highlights for users is far different from and more detailed than what it offered prior to our questions. We now know the system relied on human judgment, and not just an automated process, more than previously acknowledged.

"Facebook has recognized the limitations of efforts to keep information systems fully free from potential bias, which lends credibility to its findings. While the committee remains open to new information on this matter, transparency - not regulation - remains the goal, so I thank the company for its efforts to acknowledge relevant facts and its recognition of a continuing need to transparently address relevant user questions."