The reasons are obvious: Wikipedia wants to avoid situations in which false or misleading information is posted by anonymous users, whether maliciously or not. The new editorial process, currently being piloted on the German Wikipedia, would queue edits for approval by a trusted editor. Anyone could still submit an edit, even anonymously, but it would require approval before actually being published. “Trusted” editors would be able to publish content instantly.
There are, of course, flaws in that system. It can take days or even weeks for a change to be approved, which is a real concern when multiple people are contributing to an article. There is also the risk that the sheer volume of edits overwhelms the editors who approve them. Because of that, Wikipedia is asking for input on other solutions. Ultimately, this boils down to a lack of trust. If Wikipedia wants to become a more trustworthy source of information, one that people can rely on, it will need some sort of sanity-checking system in place. Can it accomplish this through moderation alone, or is a more complex solution needed?