This is interesting, and I get the notion that better prompts equate to better results; there are many situations where that applies. But some of these examples aren't great, because they're cases of needing to do 3x the work to get the best result. If I have to give ChatGPT three sets of example airport codes just to get the correct airport code, why don't I just look it up myself? Same goes for the math problem: if I have to explain how to solve the problem, why don't I just solve it myself?