Thinking that anyone can easily get their hands on a mostly coherent set of data after it's been overwritten 10x is just delusional. Data storage is increasingly dense, and as any layperson might infer, when you have exponential growth in miniaturization, recovery after willful destruction becomes substantially more difficult.
Nor would I buy that the FBI can really do much of anything at a truly forensic hardware level. I would bet that 99% of their "forensic" efforts amount to extracting the storage devices, mounting them read-only and cloning them, then pushing a giant red button, whereby some clunky but extremely expensive analysis suite designed by a contractor does its magic: pulling and reconstructing deleted inodes, checking cache locations, etc. In some cases they can probably pop the platters out and interrogate them with a custom controller to go a bit deeper, but after a thorough wipe I'd bet the amount of recoverable data is in the single-digit percent range. I'm pretty sure the best and brightest in deep systems design don't head to the FBI.
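For what it's worth, the "giant red button" part isn't exotic. Here's a minimal sketch (mine, not any real tool's internals) of the simplest trick in that box: signature-based file carving over a raw image. The image path and the single JPEG signature are just placeholders for illustration; real suites like The Sleuth Kit also walk filesystem metadata, but the underlying idea is the same: deleted bytes are still sitting there unless something actually overwrote them.

```python
# Naive file carving sketch: scan a raw disk image for JPEG magic bytes
# and record the offsets where candidate files begin. "evidence.img" is
# a made-up path, not a real artifact.

JPEG_MAGIC = b"\xff\xd8\xff"
CHUNK = 1024 * 1024  # read the image 1 MiB at a time

def carve_offsets(image_path):
    offsets = []
    with open(image_path, "rb") as img:
        pos = 0      # bytes consumed so far
        tail = b""   # carry-over so matches spanning chunk boundaries aren't missed
        while True:
            chunk = img.read(CHUNK)
            if not chunk:
                break
            buf = tail + chunk
            start = 0
            while True:
                hit = buf.find(JPEG_MAGIC, start)
                if hit == -1:
                    break
                # translate the index within buf back to an absolute offset
                offsets.append(pos - len(tail) + hit)
                start = hit + 1
            tail = buf[-(len(JPEG_MAGIC) - 1):]
            pos += len(chunk)
    return offsets

if __name__ == "__main__":
    print(carve_offsets("evidence.img"))
```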
The real enemies of people who want privacy are flawed software implementations and lax operational security. That is, your wiper doesn't actually finish the job or does it poorly, or you never delete anything because your 2 TB hard drive is effectively infinite for your needs. On top of that, cloud storage is increasingly efficient, and as you can see with Gmail, it's now possible to store things basically forever. You should assume that anything unencrypted in the cloud that you delete will have 3-10 copies floating around, any of which may persist for over a year.
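To make the "flawed wiper" point concrete, here's a toy Python sketch of the kind of secure-delete routine that looks correct and can still fail silently. This is my own illustration, not any particular tool:

```python
import os

def naive_wipe(path):
    """Looks like a secure delete, but isn't guaranteed to be one."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(os.urandom(size))   # overwrite in place with random bytes
        f.flush()
        os.fsync(f.fileno())        # force the write out of the page cache
    os.remove(path)

# Even with the fsync, this can quietly accomplish nothing:
#  - SSDs remap logical blocks (wear leveling), so the old cells may survive
#  - copy-on-write filesystems (btrfs, ZFS, APFS) write the new data elsewhere
#  - journals, snapshots, and backups keep copies the wiper never touches
```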
It's possible, even probable, that the NSA or the CIA possess more advanced interrogation techniques, but those processes are likely very labor-intensive, not scalable, and probably reserved for matters of national interest. A more likely strategy is that these agencies have spent billions of dollars building systems that listen to traffic across the internet and use sophisticated machine learning to pre-emptively collect and collate data of interest. Then, when they find your computer, they just need data fragments (traces of web history, the browser user-agent and installed plug-ins sent in HTTP headers) that link your data to their data, and they can fill in a lot of the missing pieces.
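The linking step is the easy part. As a toy sketch (the specific fields and values below are assumptions for illustration, not anyone's actual pipeline), a handful of passively observable attributes collapses into a fingerprint that can match intercepted traffic to a seized machine's browser profile; the EFF's Panopticlick research showed how few signals it takes to narrow someone down:

```python
import hashlib

def browser_fingerprint(user_agent, plugins, accept_language):
    """Collapse a few passively observable attributes into one identifier."""
    material = "|".join([user_agent, ",".join(sorted(plugins)), accept_language])
    return hashlib.sha256(material.encode("utf-8")).hexdigest()

# The same hash computed from captured HTTP headers and from a local browser
# profile would tie the two data sets together.
fp = browser_fingerprint(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",  # placeholder user-agent
    ["Flash", "Java", "QuickTime"],                    # placeholder plug-in list
    "en-US,en;q=0.9",
)
print(fp)
```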