Boffins fear we might be running out of ideas
A new article claims that it now takes far more scientists to keep Moore's Law going than it did years ago: research is not as effective as it used to be, ideas are getting harder to find, and an idea drought is setting in.
The boffins observe that exponential growth is getting harder to achieve because we have to double our research efforts every 13 years in order to maintain constant expansion.
This applies beyond the semiconductor industry. The authors found that research productivity for agricultural crop yields is declining at a rate of about 5 per cent annually, and noted a comparable rate of decline in the mortality improvements associated with cancer and heart disease.
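To put rough numbers on those two claims (this is my own back-of-the-envelope arithmetic, not a calculation from the article):

```python
# Back-of-the-envelope check, assuming the figures quoted above are accurate.

# If research effort must double every 13 years just to keep progress steady,
# the implied annual growth in effort is:
annual_growth = 2 ** (1 / 13) - 1
print(f"implied annual growth in research effort: {annual_growth:.1%}")   # ~5.5%

# That is roughly the mirror image of the ~5% per year decline in research
# productivity reported for crop yields: effort has to grow at about the same
# rate that productivity falls, just to stand still.
decline = 0.05
effort_after_13_years = (1 + decline) ** 13
print(f"effort needed after 13 years at 5%/yr decline: {effort_after_13_years:.2f}x")  # ~1.9x
```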
Does the article also consider that people are reproducing, and that in 13 years there will also be more people? There could be more scientists in 13 years to make up for the fact that creating new semiconductors is becoming harder. Crowd mechanisms could also take effect, so perhaps it won't matter how many people were involved in coming up with the new idea or doing the research. In a way I see the article as being a bit biased and having assumptions built in.
They do make a fair point that it will cost more resources to get the same level of progress going forward. I would think this is unavoidable at least until AI becomes a game changer. AI may deliver dramatic productivity increases in areas we cannot yet predict, so it's extremely difficult to speculate on future productivity.
I think one of the biggest barriers to innovation over the last couple of decades is that we really do not spend enough money on R&D, and on the sciences in particular. If you look at the long-term innovation trend, it is easy to see we are far below the norm. This, of course, is only getting worse now that we have an administration that has decided to cut or eliminate anything pertaining to science, including funding.
AI will be a game changer along with quantum computing. AI is growing by leaps and bounds; quantum is a decade away at least.
Great points brought up.
Does government have to fund it, or could we fund it directly through crowdfunding?
I am sure crowdfunding could be used to offset the loss of government research.
The problem is that much of the government's spending goes to foundational research which doesn't result in anything marketable on its own but does push the ball down the field. It is rather easy to focus on the last 1 per cent, when a company or individual takes existing research and alters it into a marketable product. Completely overlooked is the other 99 per cent, forged by many individuals using dozens if not hundreds of government grants to get the research to the point where it could be turned into a viable product or service.
Part of the incredible explosion in breakthroughs in the US over the years came from research done at the upper levels of the educational system and at smaller companies funded through the DOE or DOD. Sadly, that money has repeatedly been cut over the years.
This is why I like ICOs: they do fund research.
Yes, in this arena they do, without a doubt.
Sadly, thus far that funding hasn't entered realms like biology and chemistry (although I am not too worried about chemistry, because that is the first area that will go quantum).
AI certainly will help us make up the ground. There are breakthroughs everywhere. However, a bit more support would get those innovations to market faster.
Interesting post... I think it is less the ideas themselves than the institutions, the culture of risk aversion, and short-sighted budget allocations that keep truly original ideas from being realised.
There are few places left like Bell Labs, a place with interdisciplinary teams and no quarterly targets to worry about. That left them free to explore and try off-the-wall ideas. Of course those were high risk and rarely paid off ... but when they did, they sparked whole industrial revolutions ... I am very curious what Google's X is going to produce ... they might be one of the few places left with a serious budget and an appetite for trying seemingly wacky ideas.
Academia and huge multinationals will probably only generate incremental improvements ...
I’ve done some of my own research on the subject & would love to share my opinions based on what I know and have learned. Granted, some may disagree with the conclusions I’ve drawn, and I would like nothing more than to discuss it further and hear your input. Moore’s Law has recently waned due to several complex technological difficulties in optimizing transistors and integrated circuit performance. Despite this, I believe it is possible to maintain Moore’s Law until at least 2030. Improvements in computing architecture, the material composition of transistors, and the cost of producing these computers will be the driving factors in extending Moore’s Law & the computational consequences that follow.
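As a rough illustration of what "maintaining Moore's Law until 2030" would mean in practice, here is a ballpark projection. Both the 2017 baseline transistor count and the two-year doubling period are my own assumed figures, used only for illustration:

```python
# Ballpark Moore's Law projection. The baseline transistor count and the
# two-year doubling period are assumptions for illustration only.

baseline_year = 2017
baseline_transistors = 20e9   # assumed count for a large chip circa 2017
doubling_period_years = 2

for year in (2020, 2025, 2030):
    doublings = (year - baseline_year) / doubling_period_years
    projected = baseline_transistors * 2 ** doublings
    print(f"{year}: ~{projected:.1e} transistors per chip")
```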
With regard to the way in which we compute, packing more and more transistors onto a single chip does not by itself increase the clock speed. Multicore designs raise effective speed; they can be used, for example, to let multiple programs on a desktop system execute concurrently. Multithreading makes it easier to have many things logically going on in a program at once and can absorb the dead time of other threads. Real gains in performance come from using both to speed up a single program. Ultimately, though, multithreading only approximates parallel computing and doesn't address the heart of the problem.

In parallel computing, many CPUs all work at the same time. Exploiting concurrency within a single process on a single CPU with shared memory is at the very least a move in the right direction, but parallel computing may be the only approach with the potential to resurrect Moore's Law and provide a platform for future economic growth and commercial innovation. Graphics processing units, or GPUs, a type of parallel computer, enable continued scaling of computing performance in today's energy-constrained environment. The critical need is to build them in an energy-efficient manner, in which many processing cores, each optimized for efficiency rather than serial speed, work together on the solution of a problem. A fundamental advantage of parallel computers is that they efficiently turn more transistors into more performance: doubling the number of processors makes many programs run twice as fast, whereas doubling the number of transistors in a serial CPU yields only a modest performance increase at a tremendous cost in energy.

The challenge is for the computing industry to adapt to this new platform. After generations of serial programming there is enormous resistance to change, as there is in many fields of study, since it requires a break with long-standing practices. Converting the enormous volume of existing serial programs to run in parallel is a formidable task, made even more burdensome by the scarcity of programmers trained in parallel programming.
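To make the serial-versus-parallel point concrete, here is a minimal sketch (my own example, not tied to any particular architecture): the same batch of independent, CPU-bound jobs is run once on a single core and once spread across worker processes. It only works because the jobs are independent, which is exactly why converting existing serial programs is the hard part.

```python
import math
import time
from multiprocessing import Pool


def heavy_task(n):
    """A deliberately CPU-bound job: sum of square roots up to n."""
    return sum(math.sqrt(i) for i in range(n))


if __name__ == "__main__":
    jobs = [5_000_000] * 8  # eight independent chunks of work

    # Serial: one core does everything, one job after another.
    start = time.perf_counter()
    serial_results = [heavy_task(n) for n in jobs]
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    # Parallel: the same jobs spread across worker processes, so extra
    # cores translate directly into shorter wall-clock time.
    start = time.perf_counter()
    with Pool() as pool:
        parallel_results = pool.map(heavy_task, jobs)
    print(f"parallel: {time.perf_counter() - start:.2f}s")
```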
Current transistors are primarily made of silicon, a material that eventually reaches a physical limit at the atomic level on the number of transistors you can fit in a given area. Alternatively, carbon-based compounds such as graphene have been proposed for transistors. When a single monolayer of carbon atoms is extracted from nonconductive bulk graphite, it exhibits electrical properties consistent with semiconductor behavior, making it a feasible substitute for silicon. More research is needed because it is not yet commercially viable for a number of reasons, one being that its resistivity increases, reducing electron mobility. Once that is overcome, integrated circuits will shrink dramatically, since the same number of transistors can fit in a significantly smaller area.
Not only is the density of transistors in integrated circuits vital, but the price per transistor will also play an important role in how computing changes. Finding ways to lower the cost of production will go a long way toward expediting overall progress in computing. Even though the boffins observe that exponential growth is getting harder to achieve without a constant expansion of research effort, the free market will ultimately find a way to meet the demands of consumers. Sharing ideas and research efforts will help counterbalance declining research productivity and maintain economic growth, and as companies realize this they will be incentivized to cooperate with one another. I think cost is a factor that often gets overlooked when it comes to extending Moore's Law.
At times the information I have provided strays from the topic of discussion; I found it difficult to stick strictly to the scope of Moore's Law without touching on these tangential ideas. Some people might view other factors as more important to Moore's Law, but these are the ones that caught my attention. I believe quantum computing will also play a major role in the limits of computation, though that is a whole different topic in itself. What I also find captivating is the crossover between nanotechnology and biology. Some studies have shown that biological micro-cells are capable of impressive computational power while also being energy efficient, and taking advantage of that evolutionary efficiency may help us tackle future challenges. The future of computing is bright and we all play a part in it!