During defragmentation, simple (stupid) defragmenters copy a whole file at a time into a sufficiently large contiguous empty region of the disc, then delete the original file. So if, by chance, a very large file comes up early in the defragmentation cycle, there may well not be enough contiguous free space for it, and defragmentation will fail.
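The failure mode is easy to see in a toy model (purely illustrative; no real defragmenter works at this level, and the block layout below is made up): even with enough free blocks in total, a file can't be moved if no single free run is long enough.

```python
# Toy model of the naive "copy whole file, then delete original" strategy.
# Hypothetical simulation only -- not how any real defragmenter is implemented.

def find_contiguous_run(disk, length):
    """Return the start of the first free run of `length` blocks, or None."""
    run_start, run_len = None, 0
    for i, block in enumerate(disk):
        if block is None:              # None marks a free block
            if run_start is None:
                run_start = i
            run_len += 1
            if run_len == length:
                return run_start
        else:
            run_start, run_len = None, 0
    return None

def defragment_file(disk, name, blocks):
    """Copy all of `name`'s blocks into one contiguous run, then free the originals."""
    target = find_contiguous_run(disk, len(blocks))
    if target is None:
        raise RuntimeError(f"no contiguous run of {len(blocks)} free blocks for {name}")
    for offset in range(len(blocks)):
        disk[target + offset] = name   # copy first...
    for old in blocks:
        disk[old] = None               # ...only then delete the original
    return target

# 12-block disc: 4 free blocks in total, but every free run is 1 block long.
# File A occupies 3 blocks, so the naive move fails despite 4 blocks being free.
disk = ["A", None, "B", "A", None, "B", None, "A", "B", None, "B", "B"]
a_blocks = [i for i, b in enumerate(disk) if b == "A"]
try:
    defragment_file(disk, "A", a_blocks)
except RuntimeError as e:
    print(e)                           # enough free space overall, none contiguous
```

This is also why the search gets so slow near-full: the defragmenter scans the whole free-space map per file looking for a run that may not exist.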
Presumably the defragmenter you are using has a built-in safety margin, on the assumption that at 15% free space this situation is unlikely to arise.
In practice, at 5% free, the search for enough contiguous free space for every file to be moved takes a very long time, and indeed simple defragmenters can end up in a deadly embrace and apparently 'seize up' at this level.
To solve the problem, you could back up large files manually to some other medium (make VERY sure the backup is safely readable and exact), then delete the larger files yourself. Once back above the 15% free-space limit, you can defragment much more quickly, and finally (if you must) copy the backed-up files back, where they will automatically be written out defragmented at the 'top' of your disc.
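The "copy, verify, then delete" step can be done safely from a shell. A minimal sketch, using throwaway paths and a dummy 1 MB file as stand-ins for your real large file and backup medium:

```shell
set -e                                  # abort immediately if any step fails
mkdir -p /tmp/demo_backup               # stand-in for the backup medium
src=/tmp/demo_bigfile                   # stand-in for the large file
dst=/tmp/demo_backup/demo_bigfile

head -c 1048576 /dev/zero > "$src"      # create the dummy 1 MB "large file"
cp "$src" "$dst"                        # back it up
cmp "$src" "$dst"                       # byte-for-byte check; non-zero exit aborts
rm "$src"                               # delete the original ONLY after verifying
echo "moved and verified"
```

The point of `cmp` before `rm` is exactly the "make VERY sure" warning above: the original is never deleted until the copy has been proven identical.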
None of the above applies to professionally designed defragmenters, but then, who said MS has professional programmers? In reality, the defragmenter shipped with Windows XP, for instance, is a cut-down version of a commercial product.