**London**: Memgraph has unveiled version 3.0 of its in-memory graph database, designed to enhance generative AI integration with features like vector search and GraphChat, enabling natural language queries. The updates aim to improve data processing and accessibility for developers without graph theory expertise.
Memgraph, an innovative provider of in-memory graph database technology, unveiled its latest offering, Memgraph 3.0, on Wednesday, introducing several new features focused on enhancing generative AI applications. The development aims to facilitate the integration of real-time data analysis with natural language processing (NLP) capabilities, addressing specific needs in generative AI workloads.
Founded in 2016 by Dominik Tomicevic and Marko Budiselić, Memgraph was designed to meet the demands of applications requiring quick responses on dynamic datasets, such as fraud detection and supply chain planning. Traditional graph databases, such as Neo4j, are generally batch-oriented and slow down under high-velocity write workloads. Tomicevic explains that “if you have lots of writes per second (hundreds of thousands or millions per second), Neo4j can’t handle that kind of writes per second, especially being responsive at the same time to the read queries and analytics.”
Memgraph overcomes these limitations by storing all data in RAM, which allows for rapid data ingestion and the execution of complex analytics and data science algorithms on the entire graph. However, this approach limits scalability: Memgraph databases typically hold hundreds of millions of nodes and edges, with some reaching single-digit billions, well short of Neo4j's largest deployments.
With the rollout of Memgraph 3.0, the company is venturing further into the realm of generative AI through its newly introduced features, one of which is vector search. By storing graph data as vector embeddings, this functionality lets users feed the relationships captured by graph nodes and edges into the context windows of language models, improving output quality in retrieval-augmented generation (RAG) scenarios. Tomicevic noted that “language model context windows are getting very large,” citing Google’s Gemini 2.0, which can accommodate 2 million tokens. However, he cautioned that merely having a large context does not guarantee the selection of the right information, emphasising the need for advanced data organisation techniques via graph algorithms.
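The retrieval idea behind vector search can be illustrated with a minimal sketch: embed the user's question, rank graph nodes by embedding similarity, and place only the top matches into the model's context window. The node names, embeddings, and helper functions below are invented for illustration and do not reflect Memgraph's actual vector-search API.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k_nodes(query_vec, nodes, k=2):
    """nodes: list of (name, embedding) pairs; returns the k most similar names."""
    ranked = sorted(nodes, key=lambda n: cosine(query_vec, n[1]), reverse=True)
    return [name for name, _ in ranked[:k]]

# Toy graph nodes with (hypothetical) 3-dimensional embeddings.
nodes = [
    ("fraud_ring_42", [0.9, 0.1, 0.0]),
    ("invoice_7",     [0.2, 0.8, 0.1]),
    ("supplier_acme", [0.1, 0.1, 0.9]),
]
query = [0.85, 0.15, 0.05]           # embedding of the user's question
context = top_k_nodes(query, nodes)  # nodes to place in the LLM's context window
print(context)                       # → ['fraud_ring_42', 'invoice_7']
```

In a real RAG pipeline the embeddings would come from an embedding model and the candidate set from the graph database's own vector index; the ranking step, however, follows this same nearest-neighbour logic.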
In addition, the GraphRAG capability is expected to enhance the accuracy of language models, reducing the likelihood of generating inconsistent or erroneous content. Tomicevic acknowledged existing challenges with large language models, stating, “LLMs are terrible at accounting, for example. They’re also terrible at hierarchical relationships and thinking.” By leveraging graph structures, users can prompt models to process hierarchical data effectively, thereby improving overall response quality.
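The hierarchy point can be made concrete with a small sketch: given a parent-to-children adjacency map (the kind of structure a graph query returns), a traversal can flatten the hierarchy into explicit one-hop facts that an LLM handles far more reliably than an implicit tree. The organisation and names below are invented for illustration.

```python
# Hypothetical org hierarchy as a parent -> children adjacency map.
children = {
    "CEO": ["VP Sales", "VP Eng"],
    "VP Eng": ["Team Lead"],
    "Team Lead": ["Engineer"],
}

def hierarchy_facts(root):
    """Depth-first walk emitting one plain-language fact per parent-child edge."""
    facts = []
    stack = [root]
    while stack:
        node = stack.pop()
        for child in children.get(node, []):
            facts.append(f"{child} reports to {node}")
            stack.append(child)
    return facts

for fact in hierarchy_facts("CEO"):
    print(fact)
```

Feeding the model these spelled-out edges, rather than asking it to infer the reporting chain itself, is the essence of using graph structure to shore up an LLM's weak hierarchical reasoning.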
Alongside the advancements in generative AI, Memgraph 3.0 also introduced GraphChat, a natural language interface that allows users to query graph data in plain English. This feature aims to broaden accessibility for those lacking extensive expertise in graph theory. Tomicevic remarked, “Graphs are very powerful. They can do a lot of things,” and expressed optimism that GraphChat would empower developers to tackle graph data science without needing extensive specialised knowledge.
The recent updates to Memgraph also support models developed by DeepSeek, a relatively new player in AI, and introduced various enhancements aimed at bolstering performance and reliability across the platform.
Through these developments, Memgraph positions itself as a critical contributor to the evolving landscape of generative AI, particularly within the domain of graph databases, while also aiming to make sophisticated technology more accessible to a wider audience of developers. For further details on Memgraph’s new functionalities, interested individuals can visit the company’s website.
Source: Noah Wire Services