Searching for “brick” returned a page of images typical of what you’d expect from a Google image search; however, the accuracy of “similar images” was astonishing, to say the least. When I requested images similar to a single close-up brick on a white background, about 85% of the first page’s results mimicked the designated image very closely, with the remaining hits sharing at least some of its characteristics.
The feature showed similar precision with more detailed images. When searching for “Bill Gates”, prior to any filtering, the first page was mostly scattered images of the Chairman, with one unrelated image. Requesting “similar images” to a snapshot of Bill Gates sitting at a desk, with several displays, windows, and a plant in the background, filled half of the first page with the exact same image as hosted around the web; the other half, however, was a long shot, made up of group office pictures and vehicles.
Relying on automated image recognition as well as metadata, the technology is intended to spare users the hassle of re-entering a query, but for now it’s fair to call it a work in progress. You can try your own searches here and let us know how it goes in the comments.