Every once in a while, I need to explain bandwidth throughput and data transfer to someone. Many web hosts use transfer (GB per month) to measure server bandwidth usage. Other hosting companies, especially those offering colocation services, will tend to offer “95th percentile”, which is a measure of throughput (Mbits per second).
Transfer is the total amount of data sent (either incoming, outgoing, or both) in a given month. Throughput is the amount you're sending through your network pipe on a regular basis, usually by taking a sample every five minutes and averaging it over the course of the month.
Or to put it another way, transfer is when the water company bills you at the end of the month for how much you used (43,000 gallons of water between June 1st and June 30th). Throughput would be more like averaging out how much your pipe is pushing through (1 gallon per minute).
Ultimately, both measurements describe an amount of data sent over time, so we can get a rough estimate of how much transfer (GB per month) a single committed connection speed can yield.
Our hypothetical server is pushing data at a constant rate of 1 Mbit per second, with no bursting (i.e., we're not using 95th percentile measurement).
Starting with the following information:
8 bits equal 1 byte
2,592,000 seconds equals 1 month (30 days)
Step 1. Convert Megabits to Megabytes
Divide 1 Mbit by 8 to get Megabytes
1/8 MByte per second = 0.125 MB per second
Step 2. Change seconds to a fractional month (1 month = 30 days * 24 hours * 60 minutes * 60 seconds = 2,592,000 seconds)
We now have:
1 Mbit per second =
0.125 MB per 1/2592000 month
This is just restating the measurements.
Step 3. To change the values to a full month, multiply *both* sides by 2,592,000 (we want to change the values, rather than the measurements, so we need to operate on both sides of the equation).
(0.125MB * 2,592,000) per (1/2,592,000 * 2,592,000) =
324,000 MB per 1 Month
Step 4. Change MB to GB, divide by 1024 on the left:
324,000 MB / 1024 ≈ 316GB
We now have 316GB per month of data transferred if someone sustains a constant throughput of 1Mbit/sec.
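The steps above can be sketched as a small conversion function. This is just an illustration of the arithmetic in this article; the function name and the 30-day month are my own assumptions.

```python
# Convert a sustained throughput (Mbit/sec) to monthly transfer (GB).
# Assumes a 30-day month and 1 GB = 1024 MB, matching the steps above.

SECONDS_PER_MONTH = 30 * 24 * 60 * 60  # 2,592,000 seconds

def mbps_to_gb_per_month(mbps):
    mb_per_second = mbps / 8                          # Step 1: megabits -> megabytes
    mb_per_month = mb_per_second * SECONDS_PER_MONTH  # Steps 2-3: scale to a full month
    return mb_per_month / 1024                        # Step 4: MB -> GB

print(round(mbps_to_gb_per_month(1)))    # ~316 GB for a sustained 1 Mbit/sec
print(round(mbps_to_gb_per_month(100)))  # a sustained 100 Mbit/sec uplink
```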
Rounding down to a nice 300GB per month, we can figure that a 10Mbit connection speed for your server will yield a maximum of 3000GB (~3TB) of transfer per month (300GB/month X 10Mbits/sec, or 300×10). A 100 Mbit uplink can yield up to 30,000GB (~30TB) of transfer if sustained at full speed.
Bursting patterns and 95th percentile measurements (where the service provider discards the top 5% of traffic samples and averages only the remaining 95%) will cause this to fluctuate a bit. After all, if you have a 100 Mbit port speed, you can burst up to 100 Mbits of throughput for a period of time, then come back down to normal. So you may send a 300MB file in about 24 seconds if you're sending at a full 100 Mbits/sec (100 Mbits / 8 bits = 12.5 MB/sec; 300MB / 12.5 MB/sec = 24 seconds). But you won't likely sustain that 100 Mbits after the file is sent. You'll drop back down to a slower throughput until the next large amount of data needs to be sent.
Additionally, that same 300MB file will take about 240 seconds (4 minutes) to send at 10 Mbits/sec, and with a 1 Gbit uplink, you could potentially send it in as little as 2.4 seconds (traffic speed between routers, datacenters, and server disk read/write and processor speeds not included).
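The file-transfer arithmetic works the same way in reverse: divide the file size by the link speed in megabytes per second. A minimal sketch (the function name is my own, and it ignores the router/disk overhead noted above):

```python
# Seconds needed to send a file at a given sustained link speed.
# file_size_mb is in megabytes; speed_mbps is in megabits per second.

def transfer_seconds(file_size_mb, speed_mbps):
    mb_per_second = speed_mbps / 8       # megabits -> megabytes
    return file_size_mb / mb_per_second

print(transfer_seconds(300, 100))   # 300MB at 100 Mbits/sec -> 24.0 seconds
print(transfer_seconds(300, 10))    # 300MB at 10 Mbits/sec  -> 240.0 seconds
print(transfer_seconds(300, 1000))  # 300MB at 1 Gbit/sec    -> 2.4 seconds
```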