Is this a school assignment where you're expected to produce a tidy theoretical answer, or a real-life problem where you want a realistic estimate?
Looking at your question, I suspect you may be mixing up b (bits) and B (bytes) here.
Real life: a gigabit link can move roughly 100 MB of actual payload per second once you account for protocol overhead, so you need about 10 seconds for 1 GB and about 2000 seconds for 200 GB - roughly 33 minutes.
Simplistic ideal universe (treating 200 GB as 200 GiB and the link as a full 10^9 bits per second): 200 × 1024³ × 8 / 10⁹ ≈ 1718 seconds, about 28.6 minutes.
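If you want to check the arithmetic yourself, here is a small sketch of both estimates. It assumes "200 GB" means 200 × 1024³ bytes (GiB) for the ideal case, and a real-world payload rate of about 100 MB/s as mentioned above:

```python
# Back-of-the-envelope transfer times for 200 GB over a 1 Gbit/s link.

GIB = 1024**3
size_bits = 200 * GIB * 8          # 200 GiB expressed in bits

# Ideal: the full 10**9 bits/s with zero protocol overhead.
ideal_s = size_bits / 1e9
print(f"ideal: {ideal_s:.0f} s ({ideal_s / 60:.1f} min)")

# Realistic: ~100 MB/s of payload after TCP/IP and framing overhead
# (here 200 GB is taken as the decimal 200 * 10**9 bytes).
realistic_s = 200e9 / 100e6
print(f"realistic: {realistic_s:.0f} s ({realistic_s / 60:.1f} min)")
```

The gap between the two numbers is mostly protocol overhead plus the binary-vs-decimal difference in what "200 GB" means.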