Big Iron
Back in the late '80s and early '90s, I used to argue that programmers should do their coding on an 8086 machine - an IBM XT, for example - rather than on something more powerful like a 286. My argument was that by using a slow machine, you got the same experience as your average user, and so you could optimize the program appropriately.
These days it feels as though we have more power than we need, even on old machines. I am typing this on a Microsoft Surface 3 running Linux - a machine I can code on, surf the web on, and do reasonably complicated maths on - nearly everything I need to do. Even my phone is probably powerful enough for most purposes.
Now it seems that only games, deep learning applications, and simulations are really pushing the hardware to its limits.