Competition in this field has been intense, giving rise to different architectures and production methods. One of the industry's main concerns is node size, with 14 nm now the state of the art for some well-known manufacturers, although there are demos of even smaller nodes.
Some think that by 2020 the node size will be 5 nm, so the path continues from the revolutions of the past, starting at 10 micrometers back in 1971! If you are not familiar with SI units, one nanometer is 1/1,000,000,000 of a meter and one micrometer is 1/1,000,000. That means a ratio close to 1:1000 in about forty years.
However, if we are in 2014 at 14 nm and the flag is set to 5 nm by 2020, the ratio for those six years is close to 1:3. Be careful extrapolating, though: shrink factors multiply rather than add, so sixty years at this rhythm would give not 1:30 but 3 to the tenth power, roughly 1:59,000. Measured per year, a factor of 3 every six years is actually close to the historical 1:1000 in forty; the real question is whether that exponential rhythm can be sustained.
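The arithmetic above is easy to check. This is a minimal sketch using the node sizes and years from the text (10 µm in 1971, 14 nm in 2014, a projected 5 nm in 2020); the "annual shrink factor" framing is my own, not something from the original sources.

```python
def annual_shrink(start_nm, end_nm, years):
    """Constant per-year factor by which the node size shrinks."""
    return (start_nm / end_nm) ** (1 / years)

# 10 micrometers (10,000 nm) in 1971 down to 14 nm in 2014:
historical = annual_shrink(10_000, 14, 2014 - 1971)

# 14 nm in 2014 down to a projected 5 nm in 2020:
projected = annual_shrink(14, 5, 2020 - 2014)

print(f"historical: ~{historical:.3f}x per year")
print(f"projected:  ~{projected:.3f}x per year")

# Compounding matters: a 1:3 shrink every six years, sustained for
# sixty years, is 3**10 = 59,049, not 3 * 10 = 30.
```

Both rates come out under 20% per year and within a couple of points of each other, which is why naive multiplication of the ratios understates the projection so badly.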
In 1965, Intel co-founder Gordon E. Moore presented an observation on this subject, predicting that the number of transistors on a chip would double approximately every two years (his original figure was a doubling every year; he revised it to two years in 1975). Doubling one hundred only takes another hundred transistors, but doubling one hundred trillion is not as easy as saying «umbrella».
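The contrast between doubling a small count and doubling a huge one is just compounding at work. Here is a small sketch of the two-year-doubling rule; the starting counts and the 1971–2014 span are illustrative assumptions, not figures from Moore's papers.

```python
def transistors_after(start_count, years, doubling_period=2):
    """Count after compounding one doubling per period of years."""
    return start_count * 2 ** (years / doubling_period)

# Doubling 100 transistors once just needs another 100:
assert transistors_after(100, 2) == 200

# But compounding is relentless: over the 43 years from 1971 to 2014,
# a two-year doubling multiplies the count by about three million.
factor = transistors_after(1, 2014 - 1971)
print(f"growth factor over 43 years: ~{factor:,.0f}x")
```

Each doubling demands as many new transistors as everything built so far, which is why the rule gets harder to honor the longer it holds.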
Reducing node size usually brings great advantages, such as lower energy consumption and more processing power, but it also comes with setbacks like resource waste, sadly common in scenarios of abundance. Some operating systems currently need significant amounts of time for simple tasks, due to the multiple levels of abstraction added to make software development easier. Software can also use tricks to deliberately consume resources without necessity, nudging users to upgrade their devices and producing more revenue for hardware sellers and for the software inherently tied to them (bundled applications, the OS market).
There was a study somewhere claiming that six megapixels is a good choice for general-purpose photography. What about processing power for the average user?