Digital computation will not bring artificial intelligence

Scientists from Stanford University describe in simple words why current digital technology is inadequate for AI:

"The place where computing went wrong, unfortunately, was the digital decision," Surya Ganguli, an associate professor of applied physics at Stanford, told scientists, academics, and other experts gathered at the HAI at Five conference today. "We decided to store information in bits which were in turn stored and flipped by shuttling many, many electrons around through complicated transistor circuits. Every fast and reliable bit flip requires, by the laws of thermodynamics, a large energy expenditure. So we expend a lot of energy in the intermediate steps of the computation.

"Biology is completely different. The final answer is just good enough and all the intermediate steps are slow, noisy, and unreliable. But not so unreliable that the final answer isn't just good enough for what's required ... So I think we have to rethink the entire technology stack from electrons to algorithms in order to really go from megawatts to watts."

---

Energy expenditure is only one dimension of intelligence. It is important but not necessarily critical. What is becoming evident with the latest digital technologies, specifically GPUs, is that the current digital model is not viable for artificial general intelligence, or AGI. If we were to follow the exponential curves drawn by one of the singularity advocates, Ray Kurzweil, we should already have been able to create the intelligence of small mammals. And we are not there yet.

So far, what digital AI is demonstrating is that:

* it requires a lot of data
* that data needs a lot of pre-processing
* the algorithms require immense computational power of the digital kind
* and this computational power requires lots of electricity, enough to start competing with human needs

Digital computation was born out of the necessity of war, evolving from the relay switch to the vacuum tube, to the transistor, and then to the modern chips, which use electrons to define the binary state. There is a limit to what we can do with digital, and by the day it is becoming evident that artificial intelligence has hit a wall. Stanford is possibly one of many scientific institutions that has realized this: we need to come up with a new computational model.

#AI #artificialIntelligence #science #digitalComputation
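Ganguli's thermodynamic point can be put into rough numbers. The sketch below is a back-of-the-envelope illustration only, not anything from the talk: the CMOS switching energy and the cluster power figure are assumed order-of-magnitude values, used to contrast the Landauer limit for flipping a bit at room temperature with what digital hardware actually spends, and to echo the "megawatts to watts" remark.

```python
# Back-of-the-envelope comparison of bit-flip energies and power budgets.
# Assumed figures: the CMOS switching energy (~1e-15 J/bit) and the cluster
# power (~10 MW) are order-of-magnitude illustrations, not measured values.
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # room temperature, K

landauer_limit = k_B * T * math.log(2)   # minimum energy to erase one bit, ~2.9e-21 J
cmos_bit_flip = 1e-15                    # assumed energy per fast, reliable CMOS bit flip, J

print(f"Landauer limit per bit:  {landauer_limit:.2e} J")
print(f"Assumed CMOS bit flip:   {cmos_bit_flip:.2e} J")
print(f"Overhead factor:         {cmos_bit_flip / landauer_limit:.0f}x")

# Rough power budgets behind the 'megawatts to watts' remark
brain_power_w = 20.0        # commonly cited ~20 W for the human brain
cluster_power_w = 10e6      # assumed ~10 MW for a large AI training cluster
print(f"Cluster vs brain power:  {cluster_power_w / brain_power_w:.0f}x")
```

The point of the comparison is that reliable digital switching sits many orders of magnitude above the thermodynamic floor, which is the gap Ganguli argues biology exploits.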
Alfonso R. Reyes’ Post
More Relevant Posts
-
"University of Texas at Dallas researchers have developed an artificial intelligence (AI) model that could help electrical grids prevent power outages by automatically rerouting electricity in milliseconds. The UT Dallas researchers, who collaborated with engineers at the University at Buffalo in New York, demonstrated the automated system in a study published online June 4 in Nature Communications. The approach is an early example of "self-healing grid" technology, which uses AI to detect and repair problems such as outages autonomously and without human intervention when issues occur, such as storm-damaged power lines." #electricalgrid #ai #energy
Researchers engineer AI path to prevent power outages
techxplore.com
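The article does not include technical details of the UT Dallas model. Purely as a toy illustration of the "reroute in milliseconds" idea, the sketch below removes a storm-damaged line from a small, invented grid graph and finds an alternative feed path with a breadth-first search; the topology, bus names, and approach are assumptions for the example, not the published method.

```python
# Toy illustration of automatic rerouting after a line failure (not the UT Dallas model).
# The grid topology and bus names below are invented for the example.
from collections import deque

grid = {                       # adjacency list: bus -> connected buses
    "substation": ["bus_a", "bus_b"],
    "bus_a": ["substation", "bus_c"],
    "bus_b": ["substation", "bus_c"],
    "bus_c": ["bus_a", "bus_b", "load"],
    "load": ["bus_c"],
}

def find_path(grid, src, dst, failed_edges):
    """Breadth-first search for a feed path that avoids failed lines."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in grid[node]:
            edge = frozenset((node, nxt))
            if nxt not in seen and edge not in failed_edges:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Storm damages the line between substation and bus_a: reroute through bus_b.
failed = {frozenset(("substation", "bus_a"))}
print(find_path(grid, "substation", "load", failed))  # ['substation', 'bus_b', 'bus_c', 'load']
```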
-
Explore the future of computing with biocomputing, where living cells like human brain organoids are used to reduce AI's energy demands. Discover the ethical challenges, power efficiency, and potential applications of this groundbreaking technology https://hubs.la/Q02L_H070
A New Era in Artificial Intelligence
blog.virtualmedicalcoaching.com
-
Calculating faster: Coupling AI with fundamental physics https://lnkd.in/etxzt9Ry
Calculating faster: Coupling AI with fundamental physics
phys.org
-
This article sounds positive, though I can't help but wonder how 'new' it is. The concept sounds like something that electronics engineers were doing 30+ years ago in areas such as digital signal processing. I also want to believe that this is already baked into the hardware architecture. Perhaps I expect too much 🤨

Summary below, link at the end...

Researchers from the University of Copenhagen have developed a technique that reduces the energy consumption of artificial intelligence (AI) by up to 95%. The method revolves around an approach to machine learning that optimizes computational efficiency, potentially leading to greener and more sustainable AI applications. By addressing one of the critical challenges in AI, this advancement could mark a pivotal step towards making AI technologies more environmentally friendly and cost-effective.

The technique holds promise for industries increasingly reliant on AI by cutting down on the substantial energy demands traditionally associated with machine learning. Such energy-efficient advances not only lower operating costs but also align with global sustainability goals. As AI continues to integrate into various sectors, reducing its carbon footprint represents a significant stride towards a more sustainable technology landscape. A toy sketch of the kind of DSP-era trick alluded to above follows after the link.

#AI #Engineering #computerscience #algorithms #electronics
This New Technique Slashes AI Energy Use by 95% - Decrypt
decrypt.co
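Neither the post nor the headline spells out the mechanism, so the following is only a generic sketch of the sort of optimization electronics engineers have long used in digital signal processing: approximating floating-point multiplication with cheap fixed-point integer arithmetic. The Q15 format and the sample values are illustrative assumptions; this is not the University of Copenhagen technique.

```python
# Toy illustration of the DSP-era trick the comment alludes to: replacing
# floating-point multiplies with cheap integer (fixed-point) arithmetic.
# This is NOT the University of Copenhagen method, just a generic example.

SCALE = 1 << 15  # Q15 fixed-point: 15 fractional bits

def to_q15(x: float) -> int:
    return int(round(x * SCALE))

def q15_mul(a: int, b: int) -> int:
    # One integer multiply and a shift instead of a floating-point multiply.
    return (a * b) >> 15

x, y = 0.7321, -0.4112
approx = q15_mul(to_q15(x), to_q15(y)) / SCALE
print(f"float: {x * y:+.6f}")
print(f"Q15:   {approx:+.6f}")   # close, at a fraction of the hardware cost
```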
-
The Future of Science: AI's Role in Uncovering Mysteries

Artificial intelligence is not just reshaping industries; it's revolutionizing how we approach science, physics, and biology. The combination of advanced AI models and increasing computing power is opening doors to possibilities once considered beyond reach.

Recent advances in AI—particularly in neural networks and reasoning capabilities—are transforming scientific research. New benchmarks indicate that AI systems may soon rival or surpass human PhDs in specialized reasoning tasks. For a long time, complex human reasoning was thought to be uniquely difficult for machines, but this is no longer the case.

One powerful AI application is machine learning, where algorithms create predictive models. This has revolutionized research speed. Tasks like material modeling that once took years can now be completed in days or hours. Similarly, AI is accelerating the discovery of new molecules for drug treatments, significantly reducing research time. For example, AI can predict membrane partitioning profiles in a fraction of the time traditionally required. (A minimal sketch of this predictive-modeling workflow appears below.)

A major breakthrough was the sequencing of the human genome, launched in 1990 and completed by 2003—years ahead of schedule thanks to rapid technological advances. Today, AI tools like AlphaFold predict the structure of nearly all known proteins, speeding up drug discovery and genetic research.

In addition, AI is pushing the limits of chip design. Moore's Law—the doubling of transistors on a microchip every two years—held for decades, but AI is speeding up this pace. Computational capacity is now doubling every six months due to breakthroughs in AI architectures, quantum computing, and materials science, driving unprecedented growth in all scientific fields.

We live in extraordinary times. The next decade may bring discoveries that could fundamentally change our lives, possibly faster than expected. Technologies like the personal computer, mobile phones, and the internet once seemed futuristic, but today they are indispensable. AI may soon revolutionize science in a similar way.

Prepare yourself for the next wave of scientific breakthroughs. What do you think? How fast will AI change science and our lives? Share your thoughts below and join the conversation.

Tags: #ArtificialIntelligence #Biology #ScientificInnovation #AIinScience #MachineLearning #AIProgress #FutureOfScience
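The "predictive models" workflow described above is usually a surrogate model: train a regressor on a small set of expensive simulation results, then use it to screen many candidates cheaply. The sketch below is a minimal, generic illustration with synthetic data and an assumed toy "simulation" function; it is not drawn from any specific study mentioned in the post.

```python
# Minimal surrogate-modeling sketch: replace an expensive simulation with a
# fast learned predictor. The "simulation" here is a synthetic toy function.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def expensive_simulation(x):
    """Stand-in for a slow physics/chemistry calculation (toy function)."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# A small budget of expensive runs to train on...
X_train = rng.uniform(-1, 1, size=(200, 2))
y_train = expensive_simulation(X_train)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)

# ...then screen many candidates almost for free with the surrogate.
X_candidates = rng.uniform(-1, 1, size=(100_000, 2))
scores = surrogate.predict(X_candidates)
best = X_candidates[np.argmax(scores)]
print("best candidate:", best, "predicted value:", scores.max())
```

The speedup the post describes comes from this substitution: the trained model answers in microseconds what the original calculation answers in hours.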
-
I am engaged in quantum machine learning research, an interplay between machine learning and quantum computing. Although leveraging quantum computing to enhance or revolutionize AI technologies is still far away, the use of generative AI for quantum is already happening, and it is my current research focus. Generative AI for quantum involves building a better understanding of the representation, generalization, and trainability of quantum machine learning models (e.g., quantum neural networks), and it can even help scale quantum models up for scientific computation on natural data such as genomics and life-sciences datasets.
Zheng Cui, Ph.D. Samuel Yen-Chi Chen Huck Yang
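For readers unfamiliar with the term, a "quantum neural network" is typically a parameterized quantum circuit whose gate angles are trained like weights. The sketch below simulates the smallest possible case (one qubit, one trainable rotation, a parameter-shift gradient) in plain NumPy; it is a conceptual illustration under those assumptions, not the author's research code.

```python
# Smallest-possible "quantum neural network": a single-qubit parameterized
# rotation trained to push the qubit's <Z> expectation toward a target value.
# Pure NumPy state-vector simulation; illustration only.
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.array([[1, 0], [0, -1]])
state0 = np.array([1.0, 0.0])          # |0>

def expectation_z(theta):
    psi = ry(theta) @ state0
    return float(psi @ Z @ psi)

target = -0.5                           # desired <Z> value
theta = 0.1                             # trainable parameter (the "weight")
for step in range(200):
    # Parameter-shift rule: exact gradient of <Z> with respect to the gate angle.
    grad = 0.5 * (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2))
    loss_grad = 2 * (expectation_z(theta) - target) * grad
    theta -= 0.2 * loss_grad            # gradient-descent update

print(f"theta = {theta:.3f}, <Z> = {expectation_z(theta):.3f} (target {target})")
```

Real quantum machine learning models stack many such parameterized gates over many qubits, which is where the representation and trainability questions mentioned above become hard.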
Principal at Boston Consulting Group (BCG) focusing on tech & strategy, with a passion for disruptive technologies (e.g., quantum computing)
Quantum and AI enthusiasts! Tell me how you think quantum computing and AI/GenAI will impact each other.

AI is making strides in many cool areas in biology, material science, weather forecasting, etc., which were thought to be promising areas for future quantum applications:
https://lnkd.in/gWJ56ftg
https://lnkd.in/gKSztHqE
https://lnkd.in/gJHid6vx
https://lnkd.in/gg72cR9k

Will the more imminent arrival of powerful AI in these areas obscure the future of quantum computing? My intuition is no, given the dependency on AI training data and the lack of evidence that AI can solve NP-hard problems. Here is another simple heuristic argument related to GenAI/LLMs: look at the "fundamental" scaling laws.

🔹 Scaling laws for LLMs say (empirically) that if you increase model/data size, you get better performance. (This is one of the main reasons behind the confidence in pouring billions of $ of investment.)
🔹 Scaling in quantum computing can be thought of as the usual "exponential increase in computational power for a linear increase in system size."

➡ So it would appear the two are meant to operate in totally different regimes: LLMs consume steeply more resources (a power law, nearly exponential in practice) for a linear increase in performance, whereas the whole point of quantum computing is to get exponentially more powerful with a linear increase in resources. (A toy numerical comparison of the two regimes follows below.)
➡ What this means is that the advantage of quantum computing for intractably difficult problems will likely not be disrupted by AI.

What do you think? Leave me your thoughts! #Quantum #QuantumComputing #Technology #AI #GenAI #LLM
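As a toy numerical contrast of the two regimes, the sketch below assumes a Chinchilla-style power law, loss proportional to compute to the power of minus alpha, with alpha set to an illustrative 0.05 (an assumption, not a measured constant), and compares the compute needed for fixed performance gains against how a quantum state space grows with qubit count.

```python
# Toy contrast between the two scaling regimes discussed above.
# The power-law exponent alpha is an assumed, illustrative value.
import numpy as np

alpha = 0.05                       # assumed loss ~ compute**(-alpha) exponent

# LLM side: compute needed to cut the loss by successive factors of 2.
for halvings in range(1, 5):
    compute_factor = 2 ** (halvings / alpha)   # from (C'/C)**(-alpha) = 1 / 2**halvings
    print(f"halve the loss {halvings}x -> ~{compute_factor:.2e}x more compute")

# Quantum side: state-space dimension grows exponentially with qubit count.
for n_qubits in (10, 20, 40, 80):
    print(f"{n_qubits:>3} qubits -> state space of dimension 2^{n_qubits} = {2.0 ** n_qubits:.2e}")
```

Under these assumptions each halving of the loss costs roughly a million times more compute, while adding qubits multiplies the state space exponentially, which is the asymmetry the post's argument rests on.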