Edward Snowden's treasure trove of classified National Security Agency documents has forever changed how the world views privacy. But while everyone got caught up in the ongoing leaks, nobody publicly questioned exactly how Snowden got his hands on the information - until now.

One would be forgiven for thinking that Snowden used some elaborate hacking software to siphon top secret documents during his time as a contractor for the agency. But truth be told, his methods were far more rudimentary.

According to a senior intelligence official, Snowden used ordinary "web crawler" software of the kind typically used to search, index and back up a website. The software ran in the background while he worked, scraping the agency's systems and ultimately accessing an estimated 1.7 million files. Although he appears to have set specific search parameters, such as which subjects to look for and how deeply to dig, the automated activity should have been easy to detect.
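To make those "search parameters" concrete, here is a minimal, hypothetical sketch of how an ordinary crawler of this kind works: it follows links outward from a starting page, keeps pages that mention subjects of interest, and stops after a configured number of link-hops. This is purely illustrative of off-the-shelf crawling logic, not Snowden's actual tool; the URLs, keywords and function names are assumptions.

```python
# Illustrative sketch only: a breadth-first crawler with the two kinds of
# parameters described above -- subject keywords and a depth limit.
# All names and URLs are hypothetical.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, keywords, max_depth=2):
    """Breadth-first crawl from start_url, recording pages that mention
    any of the given keywords, down to max_depth link-hops."""
    seen = {start_url}
    queue = deque([(start_url, 0)])
    matches = []

    while queue:
        url, depth = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable pages

        # "Subject" parameter: keep pages mentioning a keyword of interest.
        if any(kw.lower() in html.lower() for kw in keywords):
            matches.append(url)

        # "Depth" parameter: stop following links past the configured limit.
        if depth >= max_depth:
            continue

        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            next_url = urljoin(url, href)
            if next_url.startswith("http") and next_url not in seen:
                seen.add(next_url)
                queue.append((next_url, depth + 1))

    return matches


if __name__ == "__main__":
    # Hypothetical example run against a public site.
    print(crawl("https://example.com", keywords=["privacy"], max_depth=1))
```

The point of the sketch is how unremarkable the technique is: nothing here requires exotic tooling, which is exactly why the resulting flood of automated requests should have stood out on a monitored network.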

Key to Snowden's success was that he was working at the agency's outpost in Hawaii, a location that had not yet been outfitted with the latest security measures.

Had he been at the NSA's headquarters in Fort Meade, Maryland, he almost certainly would have been caught. Agency officials said systems at that location are constantly monitored, and accessing or downloading such a large volume of data would have been noticed.

Snowden's behavior did draw attention a few times, but officials say he was able to get off the hook by providing legitimate-sounding explanations for his activity.