As many of you already know, Google's Android app marketplace has its fair share of malicious apps. Google has never been as strict as Apple about which apps it allows onto Google Play, but the company is now starting to crack down.

In fact, Google recently revealed that a new review system has been in place for the past two months or so, introducing a series of security measures for app submissions. Google has put additional algorithms to work in hopes of catching bad software, and has assigned a team of internal reviewers to analyze incoming content. It sounds as though human employees will be going hands-on with apps before they go live as well.

On top of that, Google has introduced a new rating system that it hopes will let developers target users more appropriately and help parents make informed choices. In partnership with ratings standards groups from various countries, Google will take into consideration things like sexual and violent content, gambling, and more. Developers will be required to fill out ratings questionnaires for their apps, and in countries without a participating ratings body, apps will be given basic age ratings instead.

While these changes certainly won't clean up the malware on Google Play completely, they sound like a step toward making the marketplace a more reliable place for Android apps.