Our next video is out! It is the first of a 2-part series in which Prof. Semih Salihoğlu explains the Resource Description Framework (RDF), a popular data model in graph DBMSs. If you're curious about graph modeling but intimidated by the terminology (especially around RDF), this video has you covered!
✅ Fundamentals of RDF: what are resources, URIs, literals, and (subject, predicate, object) triples, and how do they form a graph?
✅ Advantages of RDF as a data model for complex and irregular domains, and its flexibility in handling irregularities and connections between objects
✅ How RDF lets you query both data and schema information in a uniform way.
In the 2nd video (coming soon!), we will also cover RDF's powerful knowledge representation and reasoning capabilities, and their potential in the age of AI. Stay tuned! https://lnkd.in/gZM79U44
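For readers meeting these terms for the first time, the triple model can be sketched in a few lines of plain Python (no RDF library; all resource names here are invented for illustration): triples form a graph, and data and schema are queried with the same pattern-matching operation.

```python
# A minimal sketch of (subject, predicate, object) triples forming a graph.
# URIs are shortened to "ex:" strings; the example data is made up.
triples = [
    # data triples
    ("ex:Alice", "ex:worksAt", "ex:UWaterloo"),
    ("ex:Alice", "ex:age", 28),                  # literal object
    ("ex:Alice", "rdf:type", "ex:Person"),
    # schema triples live in the very same graph
    ("ex:Person", "rdfs:subClassOf", "ex:Agent"),
]

def match(s=None, p=None, o=None):
    """Return all triples matching a pattern; None is a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Querying data and querying schema are the same operation:
print(match(s="ex:Alice"))         # everything known about Alice
print(match(p="rdfs:subClassOf"))  # schema information
```

Because schema statements are just triples, the "uniform querying" the video describes falls out of the data model itself rather than needing a separate catalog API.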
Kùzu Inc.’s Post
More Relevant Posts
-
This latest video in Semih Salihoğlu's graph database fundamentals series gives a truly excellent intro to RDF! It gently introduces RDF terminology alongside concrete examples to help you understand the benefits of #rdf and where it can be useful. Some Kùzu Inc. resources are also linked in the video description. Having studied these topics for a while myself, I remember how intimidating it all felt in my early days because everything seemed so overwhelming, so I think this video will be very helpful for those just starting to understand the trade-offs between property graphs and RDF :). Looking forward to sharing part 2 of this video soon! #graph #database
RDF - Part 1: What is Resource Description Framework, and its benefits?
https://www.youtube.com/
-
Interesting paper from ICML 24 for scalable RL. Stop Regressing: Training Value Functions via Classification for Scalable Deep RL https://lnkd.in/ghukW_sU
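The paper's core idea can be sketched in a few lines: instead of regressing a scalar value target with MSE, encode it as a categorical distribution over fixed bins (a "two-hot" encoding) and train with cross-entropy. The bin range and count below are illustrative, not taken from the paper.

```python
import numpy as np

# Value range and resolution are made-up hyperparameters for the sketch.
V_MIN, V_MAX, N_BINS = 0.0, 10.0, 51
bins = np.linspace(V_MIN, V_MAX, N_BINS)

def two_hot(value):
    """Encode a scalar target as a 'two-hot' categorical distribution:
    probability mass split between the two bins bracketing the value."""
    value = np.clip(value, V_MIN, V_MAX)
    idx = min(int(np.searchsorted(bins, value, side="right")) - 1, N_BINS - 2)
    upper_w = (value - bins[idx]) / (bins[idx + 1] - bins[idx])
    probs = np.zeros(N_BINS)
    probs[idx] = 1.0 - upper_w
    probs[idx + 1] = upper_w
    return probs

def cross_entropy(target_probs, logits):
    """Classification loss replacing the usual MSE regression loss."""
    log_probs = logits - np.log(np.sum(np.exp(logits)))
    return -np.sum(target_probs * log_probs)

# The scalar value is recovered as the expectation over bin centers,
# so nothing is lost by the categorical detour:
target = two_hot(3.7)
```

The appeal is that the value network's output head becomes an ordinary softmax classifier, which tends to train more stably at scale than a scalar regression head.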
-
Here is the 2nd video on why RDF & its standards form a "knowledge representation & reasoning" (KRR) system. Don't miss the end of the video and the example I provide on interesting things one can do in principle with KRR (aka symbolic AI) systems! Enjoy!
In our latest video, Prof. Semih Salihoğlu discusses why the Resource Description Framework (RDF) and the standards around it form a knowledge representation and reasoning (KRR) system, and how one can do automatic reasoning with these standards. KRR systems are the foundation of what is nowadays referred to as symbolic AI, or "good old-fashioned AI". We live in an AI-dominated world, and modern AI techniques such as machine learning and LLMs are based on statistical reasoning. While these techniques are proving extremely useful, they cannot (yet) explain the reasoning behind their answers, even when those answers are correct. Logic-based KRR systems (along with their underlying principles) could play an important role in addressing some of the shortcomings of existing statistics-based AI systems. If we ever see the emergence of hybrid statistical + symbolic AI systems, RDF and its standards may be a popular choice for building the symbolic components of such systems. We hope this two-video series on #rdf and #graphs leaves you with plenty of food for thought! Subscribe to our YouTube channel here: https://lnkd.in/g57ZYs8n https://lnkd.in/g7xxgRbi
RDF - Part 2: RDF as knowledge representation and reasoning system
https://www.youtube.com/
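To make "automatic reasoning with these standards" concrete, here is a toy forward-chaining sketch over two of the standard RDFS entailment rules (the example triples are invented, not the video's):

```python
# Toy knowledge base: two schema facts and one data fact.
triples = {
    ("ex:GradStudent", "rdfs:subClassOf", "ex:Student"),
    ("ex:Student", "rdfs:subClassOf", "ex:Person"),
    ("ex:Karim", "rdf:type", "ex:GradStudent"),
}

def infer(kb):
    """Forward-chain two RDFS rules until no new triples appear."""
    kb = set(kb)
    while True:
        new = set()
        for (a, p1, b) in kb:
            for (c, p2, d) in kb:
                if b != c:
                    continue
                # rdfs9: (x type C) & (C subClassOf D) => (x type D)
                if p1 == "rdf:type" and p2 == "rdfs:subClassOf":
                    new.add((a, "rdf:type", d))
                # rdfs11: subClassOf is transitive
                if p1 == p2 == "rdfs:subClassOf":
                    new.add((a, "rdfs:subClassOf", d))
        if new <= kb:
            return kb
        kb |= new

kb = infer(triples)
# The reasoner derives facts that were never stated explicitly:
assert ("ex:Karim", "rdf:type", "ex:Person") in kb
```

Production reasoners implement many more rules and far smarter algorithms, but this is the essence of rule-based entailment: derived facts become first-class triples that queries can see.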
-
I was struck again today by the 'simple' difference that adding a lot more parameters to a model makes. Using a 13-billion-parameter model for a task, we got reasonable results. Using a 70-billion-parameter model from the same family was a lot more effective for a text generation task. The weird thing is that some papers (see "The Unreasonable Ineffectiveness of the Deeper Layers") suggest that many of the parameters in the larger model are underutilized or redundant. There are opportunities here for compression and tweaks that would make serving these models more efficient. It also makes you wonder how effective these models could be if we unlocked the potential in those middle layers and parameters.
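The redundancy heuristic from that line of work can be illustrated with a toy: in residual networks, a layer whose output is nearly identical to its input (small angular distance between representations) is doing little work and is a pruning candidate. The "layers" below are stand-in vectors, not a real transformer.

```python
import numpy as np

rng = np.random.default_rng(0)

def angular_distance(x, y):
    """Angle between two representation vectors, normalized to [0, 1]."""
    cos = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi

x = rng.normal(size=64)                            # layer input
useful_out = x + rng.normal(size=64)               # residual branch does real work
redundant_out = x + 1e-3 * rng.normal(size=64)     # residual branch ~ a no-op

# The redundant layer barely rotates the representation, so by this
# heuristic it could be dropped with little quality loss:
assert angular_distance(x, redundant_out) < angular_distance(x, useful_out)
```

In practice one measures this distance on real activations across a calibration set and removes the contiguous block of layers with the smallest distances, often with a light healing fine-tune afterwards.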
-
Why do LLMs fail to reproduce the same output for the same input? For the answer, give the article below a brief read.
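One common source of this nondeterminism can be sketched in a few lines: decoding usually samples from the model's next-token distribution, so identical inputs yield different outputs unless the temperature is zero or the random seed is fixed. (The logits below are made up, and real systems have further sources of variation, such as batching and floating-point summation order.)

```python
import numpy as np

logits = np.array([2.0, 1.9, 0.5])   # toy scores for three candidate tokens
tokens = ["cat", "dog", "fish"]

def sample(logits, temperature, rng):
    """Pick the next token: greedy at temperature 0, stochastic otherwise."""
    if temperature == 0.0:
        return tokens[int(np.argmax(logits))]   # deterministic
    p = np.exp(logits / temperature)
    p /= p.sum()
    return tokens[int(rng.choice(len(tokens), p=p))]

# Greedy decoding always returns the same token for the same input...
greedy = {sample(logits, 0.0, np.random.default_rng()) for _ in range(100)}
# ...while sampling at temperature 1 does not:
sampled = {sample(logits, 1.0, np.random.default_rng(i)) for i in range(100)}
```

With close logits like 2.0 vs 1.9, sampling splits its choices almost evenly between the top two tokens, which is exactly the run-to-run variation users observe.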
-
This article would be very useful for many people building voice-based applications. Typically, besides an ASR model (often Whisper), one needs other models combined with the main Whisper model on the same API endpoint. Here is guidance on how to do it (with HF Inference Endpoints).
Powerful ASR + diarization + speculative decoding with Hugging Face Inference Endpoints
huggingface.co
-
Which A.I. system writes the best computer code or generates the most realistic image? Right now, there’s no easy way to answer those questions.
A.I. Has a Measurement Problem
https://www.nytimes.com
-
I learned how you can use prompt tuning to enhance LLM performance.
Introduction to Large Language Models
cloudskillsboost.google
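For readers new to the idea, prompt tuning can be sketched as follows: a small matrix of trainable "soft prompt" vectors is prepended to the frozen model's input embeddings, and only those vectors are updated during training. All dimensions below are illustrative, and the "model" is just the concatenation step.

```python
import numpy as np

d_model, n_prompt, n_input = 16, 4, 10
rng = np.random.default_rng(0)

# Embeddings produced by the frozen LLM's embedding table (stand-ins here).
frozen_embeddings = rng.normal(size=(n_input, d_model))
# The only trainable parameters: a tiny learned prefix.
soft_prompt = rng.normal(size=(n_prompt, d_model))

def forward(soft_prompt, frozen_embeddings):
    """The model consumes [soft prompt; input] as one sequence."""
    return np.concatenate([soft_prompt, frozen_embeddings], axis=0)

seq = forward(soft_prompt, frozen_embeddings)
assert seq.shape == (n_prompt + n_input, d_model)

# Training would update soft_prompt only; the LLM's weights stay frozen,
# so one base model can serve many tasks, each with its own small prompt.
```

The design choice is the appeal: per-task storage is just `n_prompt * d_model` numbers instead of a full fine-tuned copy of the model.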
-
A.I. Has a Measurement Problem: Which A.I. system writes the best computer code or generates the most realistic image? Right now, there’s no easy way to answer those questions. https://lnkd.in/dQMAjGkE
A.I. Has a Measurement Problem
https://www.nytimes.com