Combining AI With React For A Smarter Frontend
Frontend development will have to incorporate artificial intelligence sooner rather than later. The burning questions, though, are what that even looks like and whether it must be a chatbot.
“Almost every application going forward is going to use AI in some capacity. AI is going to wait for no one,” said Jesse Hall, a senior developer advocate at MongoDB, during last week’s second virtual day of React Summit US. “In order to stay competitive, we need to build intelligence into our applications in order to gain rich insights from our data.”
For example, the results might provide a text summary or links to specific document
pages, he added.
“Imagine your React app has an intelligent chatbot with RAG [Retrieval Augmented
Generation] and vector embeddings. This chatbot could pull in real-time data, maybe
the latest product inventory, and offer it during a customer service interaction, [using]
RAG and vector embeddings,” he said. “Your React app isn’t just smart, it’s adaptable,
real-time and incredibly context-aware.”
To put a tech stack around that, he suggested developers could use Next.js 13.5 with its App Router, connected to OpenAI’s GPT-3.5 Turbo and GPT-4.
Then LangChain could be a crucial part of the stack because it helps with data pre-
processing, routing data to the proper storage, and making the AI part of the app more
efficient, he said. He also suggested using Vercel’s AI SDK, an open source library
designed to build conversational, streaming user interfaces.
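As a rough sketch of how those pieces fit together, assuming a Next.js App Router project with the OpenAI Node SDK and Vercel’s AI SDK installed, a streaming chat endpoint might look something like this (the route path and model choice are illustrative, not prescribed in the talk):

// app/api/chat/route.ts
import OpenAI from "openai";
import { OpenAIStream, StreamingTextResponse } from "ai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Ask the model for a streamed chat completion
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    stream: true,
    messages,
  });

  // Pipe the token stream straight back to the React client
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}

On the client side, the SDK’s useChat hook consumes that stream and updates React state as tokens arrive, which is what gives the conversational UI its real-time feel.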
“It’s a game changer for AI applications, enabling us to provide a more contextual and
meaningful user experience by storing our vector embeddings directly in our
application database, instead of bolting on yet another external service,” he said. “And
it’s not just vector search. MongoDB Atlas itself brings a new level of power to our
generative AI capabilities.”
When combined, this technology stack would enable smarter, more powerful React
applications, he said.
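To make that concrete, here is a minimal sketch of what querying embeddings stored directly in the application database could look like with MongoDB Atlas Vector Search; the database, collection, index and field names are placeholders for illustration:

import { MongoClient } from "mongodb";

const client = new MongoClient(process.env.MONGODB_URI!);

// Find the documents whose stored embeddings are closest to a query vector
export async function findRelevantDocs(queryVector: number[]) {
  const collection = client.db("store").collection("products");

  return collection
    .aggregate([
      {
        $vectorSearch: {
          index: "vector_index", // Atlas Vector Search index name (placeholder)
          path: "embedding",     // field holding each document's vector (placeholder)
          queryVector,
          numCandidates: 100,    // breadth of the approximate nearest-neighbor search
          limit: 5,              // how many matches to return
        },
      },
      {
        $project: {
          name: 1,
          description: 1,
          score: { $meta: "vectorSearchScore" },
        },
      },
    ])
    .toArray();
}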
“Remember, the future is not just about smarter AI, but also about how well it’s
integrated into user-centric platforms like your next React-based project,” Hall said.
“It’s not merely about leveraging the power of GPT in React. It’s about taking your
React applications to the next level by making them intelligent and context-aware,”
Hall said. “We’re not just integrating AI into React, we’re optimizing it to be as smart
and context-aware as possible.”
There’s a huge demand for building intelligence into applications and for delivering faster, more personalized experiences to users, he added. Smarter apps will use AI-powered models to take action autonomously for the user. That could look like a chatbot, but it could also look like personalized recommendations or fraud detection.
“First, your apps drive competitive advantage by deepening user engagement and
satisfaction as they interact with your application,” he explained. “Second, your apps
unlock higher efficiency and profitability by making intelligent decisions faster on
fresher, more accurate data.”
AI will be used to power the user-facing aspects of applications, but it will also lead to
“fresh data and insights” from those interactions, which in turn will power a more
efficient business decision model, he said.
Large language models, however, have shortcomings, Hall noted. “One of their key limitations is their static knowledge base,” he said. “They only know
what they’ve been trained on. There are integrations with some models now that can
search the internet for newer information. But how do we know that the information
that they’re finding on the internet is accurate? They can hallucinate very confidently, I
might add. So how can we minimize this?”
The models can be made real-time, adaptable and more aligned with specific needs by combining React, large language models and RAG, he explained.
“We’re not just integrating AI into React, we’re optimizing it to be as smart and
context-aware as possible,” he said.
He explained what’s involved with RAG, starting with vectors. Vectors are the building
blocks that allow developers to represent complex multidimensional data in a format
that’s easy to manipulate and understand. Vectors are sometimes referred to as vector embeddings, or simply embeddings.
For example, video games use 2D and 3D coordinates to track where objects are in the game’s world. But what makes vectors important in AI is that they enable semantic search, he said.
“In simpler terms, they let us find information that is contextually relevant, not just a
keyword search,” Hall said. “And the data source is not just limited to text. It can also
be images, video, or audio — these can all be converted to vectors.”
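The intuition can be shown with a few lines of arithmetic: two pieces of content are semantically “close” when the angle between their embedding vectors is small, whether or not they share any keywords. A toy cosine-similarity helper (no external services involved) makes the point:

// Cosine similarity: 1 means the vectors point the same way, 0 means unrelated
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// The embeddings for "laptop won't charge" and "notebook battery problem"
// would score close to 1, even though the two strings share no keywords.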
So step one would be creating vectors, and the way to do that is through an encoder.
Encoders define how the information is organized in the virtual space, and there are
different types of encoders that can organize vectors in different ways, Hall explained.
For example, there are encoders for text, audio, images, etc. Most of the popular
encoders can be found on Hugging Face or OpenAI, he added.
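As an example of that first step, and assuming the OpenAI Node SDK as the encoder of choice, turning a string into a vector embedding might look like the following sketch:

import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Turn a piece of text into a fixed-length vector of floats
export async function embed(text: string): Promise<number[]> {
  const result = await openai.embeddings.create({
    model: "text-embedding-ada-002", // one of OpenAI's text encoders
    input: text,
  });
  return result.data[0].embedding;
}

A Hugging Face sentence-transformer model would play the same role; what matters is that the same encoder is used both for the documents you store and for the queries you run against them.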
Finally, RAG comes into play. RAG is “an AI framework for retrieving facts from an
external knowledge base to ground large language models (LLMs) on the most
accurate, up-to-date information and to give users insight into LLMs’ generative
process,” according to IBM.
“RAG leverages vectors to pull in real-time, context-relevant data and to augment the
capabilities of an LLM,” Hall explained. “Vector search capabilities can augment the
performance and accuracy of GPT models by providing a memory or a ground truth to
reduce hallucinations, provide up-to-date information, and allow access to private
data.”
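Putting the pieces together, and reusing the hypothetical embed() and findRelevantDocs() helpers sketched above, a bare-bones RAG flow could look like this: encode the question, retrieve matching documents with vector search, and hand them to the model as grounding context:

import OpenAI from "openai";

// embed() and findRelevantDocs() are the hypothetical helpers from the earlier sketches

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function answerWithRag(question: string) {
  // 1. Encode the question into the same vector space as the stored documents
  const queryVector = await embed(question);

  // 2. Retrieve the closest documents from the application database
  const docs = await findRelevantDocs(queryVector);
  const context = docs.map((doc) => doc.description).join("\n");

  // 3. Augment the prompt so the model answers from retrieved facts,
  //    not just from its static training data
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: `Answer using only this context:\n${context}` },
      { role: "user", content: question },
    ],
  });

  return completion.choices[0].message.content;
}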
Loraine Lawson is a veteran technology reporter who has covered technology issues from data
integration to security for 25 years. Before joining The New Stack, she served as the editor of the
banking technology site, Bank Automation News. She has...