Let g and l denote the greatest common divisor and the least common multiple, respectively, of a and b.
Then gl = ab.
Since g ≥ 1 and l ≥ 1, we have (g − 1)(l − 1) ≥ 0.
∴ g + l ≤ gl + 1 = ab + 1.
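For completeness, the expansion behind this bound, written out in LaTeX:

\[
(g-1)(l-1) \ge 0 \;\Longleftrightarrow\; gl - g - l + 1 \ge 0 \;\Longleftrightarrow\; g + l \le gl + 1 = ab + 1.
\]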
Suppose that (g + l)/(a + b) > (a + b)/4.
Then we have ab + 1 ≥ g + l > (a + b)²/4, so we get (a − b)² < 4.
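The last step is just the identity (a + b)² − 4ab = (a − b)², expanded in LaTeX:

\[
4(ab + 1) > (a+b)^2 = 4ab + (a-b)^2 \;\Longrightarrow\; (a-b)^2 < 4.
\]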
Assuming a ≥ b, this forces a − b ∈ {0, 1}, so we either have a = b or a = b + 1.
In the former case, g = l = a and the quotient is (g + l)/(a + b) = 2a/(2a) = 1 ≤ (a + b)/4, since ab = a² > 2 forces a ≥ 2.
In the latter case, g = 1 and l = b(b + 1), so g + l = b² + b + 1 and a + b = 2b + 1; the hypothesis that a + b divides g + l then says that 2b + 1 divides b² + b + 1.
∴ 2b + 1 divides 4(b² + b + 1) − (2b + 1)² = 3, which implies that b = 1 and a = 2, contradicting the given assumption that ab > 2.
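The arithmetic behind this step, written out:

\[
4(b^2 + b + 1) - (2b + 1)^2 = (4b^2 + 4b + 4) - (4b^2 + 4b + 1) = 3,
\]

and since 2b + 1 is odd and at least 3, the only way it can divide 3 is 2b + 1 = 3.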
This shows that (g + l)/(a + b) ≤ (a + b)/4.
Note that for equality to hold we need g + l = (a + b)²/4 = ab + (a − b)²/4 ≤ ab + 1, so either a = b = 2, or (a − b)² = 4 and g = 1, l = ab.
The latter case happens if and only if a and b are two consecutive odd numbers. (If a = 2k + 1 and b = 2k − 1, then g = 1, so g + l = ab + 1 = 4k², and a + b = 4k divides 4k² with quotient k, which is precisely (a + b)/4.)
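A quick numerical instance of the equality family, with k = 3 (a = 7, b = 5):

\[
g = \gcd(7, 5) = 1, \qquad l = 35, \qquad \frac{g + l}{a + b} = \frac{36}{12} = 3 = \frac{a + b}{4}.
\]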