Here is a question to test your intuition for logarithms. NO CALCULATORS allowed, and don’t take a long time to work out an answer in your head! Just go with your gut. After you have committed to a choice (say, by writing it down), then go get a calculator and a pencil and see if you can work out whether you were right!
On a certain input, a linear-time algorithm runs in one-tenth of a second. On the same input, a quadratic-time algorithm takes one and a half weeks to run. Approximately how long would an n log n algorithm take on the same input?
- a few seconds?
- a few minutes?
- a few hours?
- a few days?
I’m pretty sure it wasn’t until quite a few years out of undergrad that I would have gotten this right.
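If you'd rather let a script play the role of the calculator, here is a minimal sketch of the check. It assumes the three algorithms are linear, quadratic, and n log n, all sharing the same constant factor c (an idealization, but it's the spirit of the puzzle):

```python
import math

# Assumption: runtimes are c*n, c*n^2, and c*n*log2(n) with a shared constant c.
linear_time = 0.1                     # seconds: the linear algorithm's runtime
quadratic_time = 1.5 * 7 * 24 * 3600  # seconds: 1.5 weeks, the quadratic runtime

# From c*n = linear_time and c*n^2 = quadratic_time:
#   n = quadratic_time / linear_time
n = quadratic_time / linear_time

# The n log n runtime is c*n*log2(n) = linear_time * log2(n).
nlogn_time = linear_time * math.log2(n)

print(f"n is about {n:,.0f}")
print(f"n log n runtime is about {nlogn_time:.1f} seconds")
```

The punchline falls out of the algebra: the n log n runtime is just the linear runtime multiplied by log2(n), and logarithms grow so slowly that even a huge n contributes only a modest factor.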