José Manuel de la Chica’s Post
More Relevant Posts
-
I’m truly passionate about how computer science tries to represent the world's information in a structured, formalized way so that machines can process it automatically. Since algorithms will later use that representation as raw material to produce predictions and other outputs, the less information is lost in the representation, the more accurate the prediction will be.

Current AI systems rely on data types such as text, numbers, and images, and we are starting to play with video and voice. However, I think they still struggle to fully represent real-world complexities… Echoing Plato's belief that our representations don't fully capture reality, I think that there’s “something” that cannot be represented in that structured way.

Citing another philosopher, Nietzsche once said: “Invisible threads are the strongest ties.” These threads (or relationships) can be thought of as linking not only tangible objects, like homes on a delivery route, but also abstract entities, such as users in social networks or transactions in a financial network. The information “stored” in these relationships is gold.

Twenty years ago, in my MSc thesis, I worked with knowledge graphs, and I believe a next wave of innovation will come from Network Science and Quantum Computing algorithms, reshaping (again) AI and pushing its boundaries to solve even more complex problems. #AI #ArtificialIntelligence #NetworkScience #QuantumComputing #Philosophy
Head of Generative AI. Ex-CTO Santander Universidades. Ex-BBVA. Tech Innovation in Financial Services. Exponential Technologies.
Complex Problem Solving can be addressed using advanced data analysis techniques, like those based on knowledge graphs. Julian Shun of MIT has made significant strides in developing parallel algorithms for processing large-scale networks efficiently. His research focuses on high-performance graph algorithms that accelerate complex data processing, essential for applications like fraud detection and personalized recommendations. His work on dynamic algorithms further enhances our ability to manage vast, evolving datasets, providing timely and reliable insights. Learn more about this here: https://lnkd.in/dakRQNiZ
Modeling relationships to solve complex problems efficiently
news.mit.edu
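To make the relationship-centric idea from the post above concrete, here is a minimal, hypothetical Python sketch (all names and data are invented for illustration) of a knowledge graph stored as subject-predicate-object triples, where the queries traverse the relationships rather than the entities:

```python
# Minimal sketch: a knowledge graph as subject-predicate-object triples.
# Entities, predicates, and data below are hypothetical.
from collections import defaultdict

triples = [
    ("alice", "FOLLOWS", "bob"),
    ("bob", "FOLLOWS", "carol"),
    ("alice", "SENT_PAYMENT_TO", "carol"),
    ("bob", "SENT_PAYMENT_TO", "carol"),
]

# Index outgoing edges by (subject, predicate) for fast traversal.
out_edges = defaultdict(list)
for subj, pred, obj in triples:
    out_edges[(subj, pred)].append(obj)

# Query across relationships: who received payments from people Alice follows?
for friend in out_edges[("alice", "FOLLOWS")]:
    for receiver in out_edges[(friend, "SENT_PAYMENT_TO")]:
        print(f"{friend} paid {receiver}")  # -> bob paid carol
```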
-
Amazing to see how Prof. Shun is revolutionizing graph processing! His GraphIt framework performs up to 5x faster than previous solutions. What really resonates with me is his commitment to making these complex tools user-friendly - we desperately need more of this bridge between theoretical power and practical usability. As someone who works with data networks, I appreciate how his work on dynamic algorithms could help us process real-time changes in massive datasets more efficiently.
Associate Professor Julian Shun develops high-performance algorithms and frameworks for large-scale graph processing. https://lnkd.in/ey7EQrSj
Modeling relationships to solve complex problems efficiently
news.mit.edu
-
"Dijkstra’s algorithm doesn’t just tell you the fastest route to one destination. Instead, it gives you an ordered list of travel times from your current location to every other point that you might want to visit — a solution to what researchers call the single-source shortest-paths problem. The algorithm works in an abstracted road map called a graph: a network of interconnected points (called vertices) in which the links between vertices are labeled with numbers (called weights). These weights might represent the time required to traverse each road in a network, and they can change depending on traffic patterns. The larger a weight, the longer it takes to traverse that path."
A group of researchers have found an even better version of an algorithm that can tell you how to get from Point A to Point B in the most efficient way possible. https://lnkd.in/e5h6aAhg
Computer Scientists Establish the Best Way to Traverse a Graph
https://www.quantamagazine.org
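For context, here is a minimal Python sketch of the classic Dijkstra algorithm described in the quote, using a binary heap. This is the textbook version, not the researchers' improved variant, and the example road network is invented for illustration:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths on a graph with non-negative weights.

    graph: dict mapping vertex -> list of (neighbor, weight) pairs.
    Returns a dict mapping each reachable vertex to its distance from source,
    i.e., the "ordered list of travel times" the article describes.
    """
    dist = {source: 0}
    heap = [(0, source)]  # (distance, vertex), ordered by distance
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale entry; a shorter path to u was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Example: weights as travel times between points on a road map.
roads = {
    "A": [("B", 4), ("C", 1)],
    "C": [("B", 2), ("D", 5)],
    "B": [("D", 1)],
}
print(dijkstra(roads, "A"))  # {'A': 0, 'B': 3, 'C': 1, 'D': 4}
```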
-
For those of us in the wayfinding systems business, have a look at this article.
A group of researchers have found an even better version of an algorithm that can tell you how to get from Point A to Point B in the most efficient way possible. https://lnkd.in/e5h6aAhg
Computer Scientists Establish the Best Way to Traverse a Graph
https://www.quantamagazine.org
-
Data structures play a critical role in computer science. The effective organization and processing of data directly impact the performance of software applications. In this article, we will start with the concept of data and move on to fundamental data structures, algorithms, and their advantages.
Data Structures: The Cornerstone of Computer Science
link.medium.com
-
📢 Can we teach multiple skills to a text-to-image (T2I) model (w/o expensive annotations), while minimizing knowledge conflicts between skills? 🤔
👉 SELMA improves T2I models by fine-tuning on automatically generated multi-skill image-text datasets, with skill-specific LoRA expert learning & merging.
▶ On Stable Diffusion (SD) models, SELMA boosts 5 metrics (TIFA/DSG/PickScore/ImageReward/HPS) + human eval.
▶ Fine-tuning on auto-generated data matches the performance of fine-tuning on GT data.
▶ Weak-to-strong generalization (SDv2 images can help SDXL fine-tuning).
➡ For a quick summary, check out our Twitter thread: https://lnkd.in/eW-Mverv
Working with Jialu Li*, Jaemin Cho*, Yi-Lin Sung, and Mohit Bansal at the University of North Carolina at Chapel Hill.
ArXiv: https://lnkd.in/e4S_dRB8
Project Page: https://lnkd.in/ew-apMsx
Code: https://lnkd.in/ebfcbBsD
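As a rough illustration of the "LoRA expert learning & merging" idea, here is a minimal NumPy sketch in which skill-specific low-rank deltas are averaged into one base weight. This is a hypothetical simplification, not SELMA's actual merging scheme (see the paper for that):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 768, 8  # hidden size and LoRA rank (illustrative values)

# Frozen base weight of one layer in the pretrained model.
W_base = rng.normal(size=(d, d))

# Three skill-specific LoRA experts, each a low-rank pair (B, A)
# whose product B @ A is that expert's weight delta.
experts = [
    (rng.normal(size=(d, r)) * 0.01, rng.normal(size=(r, d)))
    for _ in range(3)
]

# Naive merge: average the full-rank deltas of all experts.
# (SELMA's real merging may differ; this only shows the shape of the idea.)
delta = np.mean([B @ A for B, A in experts], axis=0)
W_merged = W_base + delta

print(W_merged.shape)  # (768, 768): one layer now carries all three skills
```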
-
Who the HELL is Rob Prim?! Let's take a journey back in time and discover the fascinating story behind Prim's algorithm, a cornerstone in the world of graphs and data.

First off, who was Prim? He wasn't a wizard or a superhero, but he was a genius! Robert C. Prim, an American mathematician, introduced this algorithm in the 1950s. Imagine him as an explorer, venturing into the uncharted territory of graph theory, seeking ways to solve complex problems.

Like many great discoveries, Prim's algorithm was born out of necessity. Back in the day, engineers and planners faced a common challenge: how to connect all the points in a network while minimizing cost. It was like trying to build the most efficient road network to connect cities, or power grids to supply electricity to homes.

Now, let's fast forward to today and see how Prim's algorithm still works its magic in the digital age. With the power of computers, we can use Prim's method to solve all sorts of real-world problems, from designing efficient transportation networks to optimizing data connections in computer networks.

Let's dive into some code to see how Prim's algorithm works its magic. Check out my implementation on CodePen: (https://lnkd.in/eXQ_Q94x) (a sketch of the same idea follows after the link below). When we run this code, we get our minimum spanning tree (MST), telling us the total cost to connect all the points in the graph. It's like having a treasure map that guides us to the most efficient routes and paths.

But how does Prim's algorithm actually work? It greedily picks the cheapest connection step by step, just like explorers charting their course through unknown territories: it starts from one point and keeps adding the point closest to the growing tree until all points are connected. It's like connecting the dots to reveal the hidden picture!

Now, let's talk about something called "asymptotic analysis." It sounds fancy, but it's basically a way to measure how fast an algorithm runs. For the simple adjacency-matrix version of Prim's algorithm, the running time is O(n^2), where 'n' is the number of points in the graph; a binary-heap version runs in O(E log V), where E is the number of edges and V the number of vertices. In simpler terms, the algorithm gets slower as the graph gets bigger.

So, there you have it: the tale of Prim's algorithm, a timeless masterpiece in the world of algorithms. It's like having a magical compass that guides us through the maze of data, helping us find efficient connections and unlock hidden treasures. Keep exploring, keep learning, and let's conquer the world of algorithms together! 🌟 #PrimAlgorithm #MST #DataMagic
Prim's Algorithm
https://codepen.io
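Since the CodePen itself isn't embedded here, the following minimal Python sketch shows the same idea using a binary heap. It is an illustrative stand-in for the linked implementation, and the example graph is made up:

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm: grow an MST from `start` by always taking the
    cheapest edge that reaches a not-yet-connected vertex.

    graph: dict mapping vertex -> list of (neighbor, weight) pairs
           (undirected: each edge listed in both directions).
    Returns (total_cost, list of MST edges).
    """
    visited = {start}
    edges = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(edges)
    total, mst = 0, []
    while edges and len(visited) < len(graph):
        w, u, v = heapq.heappop(edges)
        if v in visited:
            continue  # this edge would create a cycle; skip it
        visited.add(v)
        total += w
        mst.append((u, v, w))
        for nxt, nw in graph[v]:
            if nxt not in visited:
                heapq.heappush(edges, (nw, v, nxt))
    return total, mst

# Hypothetical example: costs to connect four cities.
cities = {
    "A": [("B", 2), ("C", 3)],
    "B": [("A", 2), ("C", 1), ("D", 4)],
    "C": [("A", 3), ("B", 1), ("D", 5)],
    "D": [("B", 4), ("C", 5)],
}
cost, tree = prim_mst(cities, "A")
print(cost, tree)  # 7 [('A', 'B', 2), ('B', 'C', 1), ('B', 'D', 4)]
```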
-
Unlocking Algorithmic Mysteries: Navigating the Boundaries of Big O! Dive into my latest article exploring the fascinating world of algorithmic complexity and discover when Big O notation might surprise you. #TechTalks #Algorithms #BigOExploration
The Limits of Big O: When Does It Fall Short?
link.medium.com
-
What will Mikkel Thorup speak about at the Probability in Computer Science PhD School in Fall? Hashing in Probabilistic Algorithms! Early registration deadline: September 1. More info and registration: https://lnkd.in/dJDhFG36 #PleaseShare #etaps ACM, Association for Computing Machinery, #SIGPLAN #SIGLOG #fopps

Hash functions bridging the gap from theory to practice

Randomized algorithms are often enjoyed for their simplicity, but the hash functions employed to yield the desired probabilistic guarantees are often too complicated to be practical. Hash functions are used everywhere in computing, e.g., hash tables, sketching, dimensionality reduction, sampling, and estimation.

Abstractly, we like to think of hashing as fully-random hashing, assigning independent hash values to every possible key, but essentially this requires us to store the hash values for all keys, which is unrealistic for most key universes, e.g., 64-bit keys. In practice, we have to settle for implementable hash functions, and practitioners often settle for implementations that are too simple, in that the algorithms end up working only for sufficiently random input. However, the real world is full of structured, non-random input. The issue is severe, because simple hash functions will often work very well in tests with random input. Moreover, the issue is often that error events that should never happen in practice happen with far too high probability. This does not show up in a few tests, but it will show up over time when you put the system in production.

Over the last decade there have been major developments in simple tabulation-based hash functions offering strong theoretical guarantees, supporting fundamental properties such as Chernoff bounds, sparse Johnson-Lindenstrauss transforms, succinct hash tables, fully-random hashing on a given set w.h.p., etc. I will discuss some of the principles of these developments and offer insights on how far we can bridge from theory (assuming fully-random hash functions) to practice (needing something that can actually be implemented efficiently).
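For readers curious what "simple tabulation" looks like in code, here is a minimal Python sketch of the textbook scheme (an illustration, not code from the talk): split the key into bytes, look each byte up in its own table of random values, and XOR the results.

```python
import random

class SimpleTabulationHash:
    """Simple tabulation hashing for 64-bit keys, viewed as 8 bytes.

    Each byte position gets its own table of 256 random 64-bit values;
    the hash of a key is the XOR of one table entry per byte.
    """

    def __init__(self, seed=42):
        rng = random.Random(seed)
        self.tables = [
            [rng.getrandbits(64) for _ in range(256)] for _ in range(8)
        ]

    def hash(self, key):
        h = 0
        for i in range(8):
            byte = (key >> (8 * i)) & 0xFF  # extract the i-th byte
            h ^= self.tables[i][byte]
        return h

th = SimpleTabulationHash()
print(hex(th.hash(123456789)))
print(th.hash(1) == th.hash(1))  # deterministic for a fixed seed: True
```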
-
In the vast realm of computational problems, there’s a special category that has intrigued and challenged computer scientists for decades: the NP-complete problems. At its core, an NP-complete problem is a type of problem for which no efficient solution has been found, but if a solution is given, it can be verified quickly. Imagine you’re trying to solve a jigsaw puzzle. Finding the correct arrangement of pieces can be time-consuming. But if someone shows you the completed picture, you can instantly verify it’s correct. That’s the essence of NP-complete problems: hard to solve, but easy to verify. https://lnkd.in/dDUMw7-Z
Decoding Complexity: A Deep Dive into NP-complete Problems
medium.com
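The "easy to verify" half has a precise meaning: given a proposed solution (a certificate), a polynomial-time check suffices. Here is a minimal Python sketch for Subset Sum, a classic NP-complete problem (a generic illustration, not taken from the linked article):

```python
def verify_subset_sum(numbers, target, certificate):
    """Verify a proposed Subset Sum solution in polynomial time.

    certificate: indices into `numbers` claimed to sum to `target`.
    Finding such a subset may take exponential time in the worst case,
    but checking a claimed one is just a sum; that asymmetry is the
    essence of NP.
    """
    if len(set(certificate)) != len(certificate):
        return False  # indices must not repeat
    if any(i < 0 or i >= len(numbers) for i in certificate):
        return False  # indices must be in range
    return sum(numbers[i] for i in certificate) == target

numbers = [3, 34, 4, 12, 5, 2]
print(verify_subset_sum(numbers, 9, [2, 4]))  # 4 + 5 == 9 -> True
print(verify_subset_sum(numbers, 9, [0, 1]))  # 3 + 34 != 9 -> False
```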