ai.smithery/infranodus-mcp-server-infranodus

Platform & Services

by infranodus

Maps text into knowledge graphs, building structured representations of concept relationships for analyzing topics and their connections.

README

InfraNodus MCP Server

A Model Context Protocol (MCP) server that integrates InfraNodus knowledge graph and text network analysis capabilities into LLM workflows and AI assistants like Claude Desktop.

Overview

InfraNodus MCP Server enables LLM workflows and AI assistants to analyze text using advanced network science algorithms, generate knowledge graphs, detect content gaps, and identify key topics and concepts. It transforms unstructured text into structured insights using graph theory and network analysis.

InfraNodus MCP Server

Features

You Can Use It To

  • Connect your existing InfraNodus knowledge graphs to your LLM workflows and AI chats
  • Identify the main topical clusters in discourse without missing the important nuances (works better than standard LLM workflows)
  • Identify the content gaps in any discourse (helpful for content creation and research)
  • Generate new knowledge graphs from any text and use them to augment your LLM responses
  • Save and retrieve entities and relations from memory using the knowledge graphs

Available Tools

  1. generate_knowledge_graph

    • Convert any text into a visual knowledge graph
    • Extract topics, concepts, and their relationships
    • Identify structural patterns and clusters
    • Apply AI-powered topic naming
    • Perform entity detection for cleaner graphs
  2. analyze_existing_graph_by_name

    • Retrieve and analyze existing graphs from your InfraNodus account
    • Access previously saved analyses
    • Export graph data with full statistics
  3. analyze_text

    • Analyze a text, URL, or YouTube transcript
    • Extract and analyze a graph from text or URL; provide either text or url
    • Get topics, clusters, statements, graph structure, and AI summary as requested
  4. generate_content_gaps

    • Detect missing connections in discourse
    • Identify underexplored topics
    • Generate research questions
    • Suggest content development opportunities
  5. generate_topical_clusters

    • Generate topics and clusters of keywords from text using knowledge graph analysis
    • Go beyond generic insights and detect smaller, more specific topics
    • Use the topical clusters to establish topical authority for SEO
  6. generate_contextual_hint

    • Generate a topical overview of a text and provide insights for LLMs to generate better responses
    • Use it to get a high-level understanding of a text
    • Use it to augment prompts in your LLM workflows and AI assistants
  7. generate_research_questions

    • Generate research questions that bridge content gaps from text, URL, or an existing InfraNodus graph
    • Use them as prompts in your LLM models and AI workflows
    • Use any AI model (included in InfraNodus API)
    • Content gaps are identified based on topical clustering
  8. generate_research_ideas

    • Generate innovative research ideas based on content gaps identified in the text
    • Get actionable ideas to improve the text and develop the discourse
    • Use any AI model (included in InfraNodus API)
    • Ideas are generated from gaps between topical clusters
  9. optimize_text_structure

    • Analyze the level of bias and coherence in text using knowledge graph analysis
    • If the text is too biased, develop the represented topics to balance the discourse
    • If the text is focused or diversified, develop the content gaps to deepen the analysis
    • If the text is dispersed, focus on the most common gap topics to improve coherence
    • Choose response type: response, idea, question, or transcend
  10. generate_responses_from_graph

    • Generate responses based on an existing InfraNodus graph
    • Integrate them into your LLM workflows and AI assistants
    • Use any AI model (included in InfraNodus API)
    • Use any prompt
  11. develop_conceptual_bridges

    • Analyze text and develop latent ideas based on concepts that connect this text to a broader discourse
    • Discover hidden themes and patterns that link your text to wider contexts
    • Use any AI model (included in InfraNodus API)
    • Generate insights that help develop the discourse
  12. develop_latent_topics

    • Analyze text and extract underdeveloped topics with ideas on how to develop them
    • Identify topics that need more attention and elaboration
    • Use any AI model (included in InfraNodus API)
    • Get actionable suggestions for content expansion
  13. develop_text_tool

    • Comprehensive text analysis combining content gap ideas, latent topics, and conceptual bridges
    • Executes multiple analyses in sequence with progress tracking
    • Generates research ideas based on content gaps
    • Identifies latent topics and conceptual bridges to develop
    • Finds content gaps for deeper exploration
  14. create_knowledge_graph

    • Create a knowledge graph in InfraNodus from your text and provide a link to it
  15. overlap_between_texts

    • Create knowledge graphs from two or more texts and find the overlap (similarities) between them
    • Use it to find similar topics and keywords across different texts
  16. merged_graph_from_texts

    • Build a merged graph from all the texts and URLs provided, showing the topical clusters and gaps present in the combined content
    • Use it to combine multiple sources into one graph and see clusters and content gaps across the merged content
  17. difference_between_texts

    • Compare knowledge graphs from two or more texts and find what's not present in the first graph that's present in the others
    • Use it to find how one text can be enriched with the others
  18. analyze_google_search_results

    • Generate a graph with keywords and topics for Google search results for a certain query
    • Use it to understand the current informational supply (what people find)
  19. analyze_related_search_queries

    • Generate a graph from the search queries suggested by Google for a certain query
    • Use it to understand the current informational demand (what people are looking for)
  20. search_queries_vs_search_results

    • Generate a graph of keyword combinations and topics people tend to search for that do not readily appear in the search results for the same queries
    • Use it to understand what people search for but don't yet find
  21. generate_seo_report

    • Analyze content for SEO optimization by comparing it with Google search results and search queries
    • Identify content gaps and opportunities for better search visibility
    • Get comprehensive analysis of what's in search results but not in your text
    • Discover what people search for but don't find in current results
  22. memory_add_relations

    • Add relations to the InfraNodus memory from text
    • Automatically detect entities or use [[wikilinks]] syntax to mark them
    • Save memory to a specified graph name for future retrieval
    • Support automatic entity extraction or manual entity marking
    • Provide links to created memory graphs for easy access
  23. memory_get_relations

    • Retrieve relations from InfraNodus memory for specific entities
    • Search for entity relations using [[wikilinks]] syntax
    • Query specific memory contexts or search across all memory graphs
    • Extract statements and relationships from stored knowledge graphs
    • Support both entity-specific searches and full context retrieval
  24. retrieve_from_knowledge_base

    • Retrieve context from an existing InfraNodus knowledge graph using GraphRAG
    • Query your knowledge base with a natural language prompt to get relevant statements
    • Include graph summaries for quick overviews of the knowledge structure
    • Optionally retrieve the full graph, statements, or extended analysis
    • Ideal for augmenting LLM responses with domain-specific knowledge
  25. search

    • Search through existing InfraNodus graphs
    • Also use it to search through the public graphs of a specific user
    • Compatible with ChatGPT Deep Research mode via Developer Mode > Connectors
  26. fetch

    • Fetch a specific search result for a graph
    • Can be used in ChatGPT Deep Research mode via Developer Mode > Connectors

More capabilities coming soon!

Key Capabilities

  • Topic Modeling: Automatic clustering and categorization of concepts
  • Content Gap Detection: Find missing links between concept clusters
  • Entity Recognition: Clean extraction of names, places, and organizations
  • AI Enhancement: Optional AI-powered topic naming and analysis
  • Structural Analysis: Identify influential nodes and community structures
  • Network Structure Statistics: Modularity, centrality, betweenness, and other graph metrics
  • Knowledge Graph Memory: Save and retrieve knowledge graph memories and analyze them to retrieve key nodes, clusters, and connectors
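As a rough illustration of what a metric like degree centrality measures on a concept graph (a toy sketch only; InfraNodus computes these statistics server-side with full network-science algorithms, and the example concepts are made up):

```typescript
// Toy degree-centrality computation on a small concept co-occurrence graph.
// Purely illustrative; not InfraNodus's implementation.
type Edge = [string, string];

function degreeCentrality(edges: Edge[]): Map<string, number> {
  const degree = new Map<string, number>();
  for (const [a, b] of edges) {
    // Each edge contributes one degree to both of its endpoints.
    degree.set(a, (degree.get(a) ?? 0) + 1);
    degree.set(b, (degree.get(b) ?? 0) + 1);
  }
  return degree;
}

// "graph" co-occurs with the most other concepts, so it is the most central node.
const edges: Edge[] = [
  ["graph", "network"],
  ["graph", "text"],
  ["graph", "topic"],
  ["text", "topic"],
];
const centrality = degreeCentrality(edges);
console.log(centrality.get("graph")); // 3
```

Nodes with high centrality are the influential concepts InfraNodus surfaces; metrics like betweenness and modularity extend the same idea to clusters and connectors.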

Knowledge Graph Memory Use Advice

InfraNodus represents any text as a network graph in order to identify the main clusters of ideas and gaps between them. This helps generate advanced insights based on the text's structure. The network is effectively a knowledge graph that can also be used to retrieve complex ontological relations between different entities and concepts. This process is automated in InfraNodus using the search and fetch tools along with the other tools that analyze the underlying network.

However, you can also easily use InfraNodus as a more traditional memory server to save and retrieve relations. We use [[wikilinks]] to highlight entities in your text to make your content and graphs compatible with markup syntax and PKM tools such as Obsidian. By default, InfraNodus will generate the name of the memory graph for you based on the context of the conversation. However, you can modify this default behavior by adding a system prompt or project instruction into your LLM client.

Specifically, you can instruct it to always use a specific knowledge graph for memories, so everything is stored in one place:

code
Save all memories in the `my-memories` graph in InfraNodus.

Or you can ask InfraNodus to only save certain entities, e.g. for building social networks:

code
When generating entities, only extract people, companies, and organizations. Ignore everything else.
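For illustration, the [[wikilinks]] entity marking works like this (a toy extractor sketch, not the actual InfraNodus parser; the note text and entity names are made up):

```typescript
// Extract manually marked entities from text using [[wikilinks]] syntax.
// Toy sketch; InfraNodus performs this (plus automatic entity detection)
// server-side when you save memories.
function extractWikilinks(text: string): string[] {
  const re = /\[\[([^\]]+)\]\]/g;
  const entities: string[] = [];
  let m: RegExpExecArray | null;
  while ((m = re.exec(text)) !== null) {
    entities.push(m[1]); // capture group: the entity name inside the brackets
  }
  return entities;
}

const note =
  "[[Alice]] joined [[Acme Corp]] after meeting [[Bob]] at a conference.";
console.log(extractWikilinks(note)); // [ 'Alice', 'Acme Corp', 'Bob' ]
```

Because the same syntax is used by PKM tools such as Obsidian, notes marked this way stay portable between your memory graphs and your notes.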

Installation

The easiest and fastest way to launch the InfraNodus MCP server is to use our server URL https://mcp.infranodus.com for remote / web applications, or to add a manual configuration to your LLM apps if you run them locally.

You can also install the server locally, which gives you more control over it. In this case, you can also edit the source files and even create your own tools based on the InfraNodus API.

Below we describe the two different ways to set up your InfraNodus MCP server.

1. Easiest Setup: InfraNodus MCP Server (via HTTP/SSE)

  1. Prerequisites
  2. Get the URL
  • We currently use the following URL for our MCP server deployed in our infrastructure:
bash
https://mcp.infranodus.com
  3. Add the MCP Server URL to the Client Tool Where You Want to Use InfraNodus
  • Once you add the URL above to your tool, it will automatically prompt you to authenticate via OAuth so it can access the InfraNodus MCP server hosted there.
  4. Use the InfraNodus Tools in Your Calls
  • To use InfraNodus, browse the available tools and simply call them through the chat interface (e.g. "show me the graphs where I talk about this topic" or "get the content gaps from the document I uploaded").

  • If your client does not use InfraNodus for some actions, explicitly instruct it to use InfraNodus.

2. Manual Setup: via NPX

You can deploy the InfraNodus server manually via npx, a tool that lets you execute local and remote Node.js packages on your computer.

The InfraNodus MCP server is available as an npm package at https://www.npmjs.com/package/infranodus-mcp-server, from where you can launch it on your local computer with npx. It will expose its tools to the MCP client that uses this command to launch the server.

For Claude Desktop / Cursor IDE:

Just add this in your Claude's configuration file (Settings > Developer > Edit Config), inside the "mcpServers" object where the different servers are listed:

json
{
	"mcpServers": {
		"infranodus": {
			"command": "npx",
			"args": ["-y", "infranodus-mcp-server"],
			"env": {
				"INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
			}
		}
	}
}

For Claude Code

To connect the InfraNodus MCP server to Claude Code, you can use this command. Make sure to provide the correct InfraNodus API key for your account:

bash
claude mcp add infranodus -s user \
	-- env INFRANODUS_API_KEY=YOUR_INFRANODUS_KEY \
		npx -y infranodus-mcp-server

3. Manual Setup: Launching MCP as a Local Server (for inspection & development)

  1. Prerequisites
  2. Clone and build the server:

    bash
    git clone https://github.com/yourusername/mcp-server-infranodus.git
    cd mcp-server-infranodus
    npm install
    npm run build:inspect
    

Note that build:inspect generates the dist/index.js file, which you will then use in your server setup. The standard npm run build command only builds a Smithery file.

  3. Set up your API key:

    Create a .env file in the project root:

    code
    INFRANODUS_API_KEY=your-api-key-here

  4. Inspect the MCP:

    bash
    npm run inspect


Claude Desktop Configuration (macOS)

  1. Open your Claude Desktop configuration file:

    bash
    open ~/Library/Application\ Support/Claude/claude_desktop_config.json
    
  2. Add the InfraNodus server configuration:

a. remote launch via npx:

json
{
	"mcpServers": {
		"infranodus": {
			"command": "npx",
			"args": ["-y", "infranodus-mcp-server"],
			"env": {
				"INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
			}
		}
	}
}

b. launch this repo with node, specify the absolute path to the repo + /dist/index.js:

json
{
	"mcpServers": {
		"infranodus": {
			"command": "node",
			"args": ["/absolute/path/to/mcp-server-infranodus/dist/index.js"],
			"env": {
				"INFRANODUS_API_KEY": "your-api-key-here"
			}
		}
	}
}

Note: you can leave INFRANODUS_API_KEY empty, in which case you can make 70 free requests; after that you will hit the quota and will need to add your API key.

  3. Restart Claude Desktop to load the new server.

Claude Desktop Configuration (Windows)

  1. Open your Claude Desktop configuration file:

    code
    %APPDATA%\Claude\claude_desktop_config.json
    
  2. Add the InfraNodus server configuration:

a. remote launch via npx:

json
{
	"mcpServers": {
		"infranodus": {
			"command": "npx",
			"args": ["-y", "infranodus-mcp-server"],
			"env": {
				"INFRANODUS_API_KEY": "YOUR_INFRANODUS_API_KEY"
			}
		}
	}
}

b. launch this repo with node:

json
{
	"mcpServers": {
		"infranodus": {
			"command": "node",
			"args": ["C:\\path\\to\\mcp-server-infranodus\\dist\\index.js"],
			"env": {
				"INFRANODUS_API_KEY": "your-api-key-here"
			}
		}
	}
}
  3. Restart Claude Desktop.

Cursor Configuration

Other MCP-Compatible Applications

For other applications supporting MCP, use the following command to start the server via npx:

bash
INFRANODUS_API_KEY=your-api-key npx -y infranodus-mcp-server

or locally

bash
INFRANODUS_API_KEY=your-api-key node /path/to/mcp-server-infranodus/dist/index.js

The server communicates via stdio, so configure your application to run this command and communicate through standard input/output.
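For reference, the stdio transport exchanges newline-delimited JSON-RPC 2.0 messages. Below is a minimal sketch of the initialize request an MCP client would write to the server's stdin; field values such as the client name and version are illustrative:

```typescript
// Minimal sketch of the JSON-RPC "initialize" request an MCP client writes
// to the server's stdin over the stdio transport (one JSON object per line).
// Client name/version are made-up placeholders.
const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "my-client", version: "1.0.0" },
  },
};

// The wire format is the serialized JSON object followed by a newline:
const frame = JSON.stringify(initialize) + "\n";
console.log(frame.trimEnd().startsWith('{"jsonrpc":"2.0"')); // true
```

MCP client libraries and hosts like Claude Desktop handle this framing for you; the sketch only shows what travels over stdin/stdout.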

Legacy Setup via Smithery

The InfraNodus server is also available through Smithery, a repository of MCP servers with an easy-to-follow installation process for most LLM clients. You will need a separate account at Smithery, though.

For Cursor:
json
// e.g. Cursor accesses the server directly via Smithery
"mcpServers": {
    "mcp-server-infranodus": {
      "type": "http",
      "url": "https://server.smithery.ai/@infranodus/mcp-server-infranodus/mcp?api_key=YOUR_SMITHERY_KEY&profile=YOUR_SMITHERY_PROFILE",
      "headers": {}
    }
  }

For Claude:

json
// Claude uses a slightly different implementation
// For this, it launches the MCP server on your local machine
"mcpServers": {
   "mcp-server-infranodus": {
			"command": "npx",
			"args": [
				"-y",
				"@smithery/cli@latest",
				"run",
				"@infranodus/mcp-server-infranodus",
				"--key",
				"YOUR_SMITHERY_KEY",
				"--profile",
				"YOUR_SMITHERY_PROFILE"
			]
		}
  }

Note: in both cases, you'll automatically get the YOUR_SMITHERY_KEY and YOUR_SMITHERY_PROFILE values from Smithery when you copy the URL with credentials. These are not your InfraNodus API keys. You can use the InfraNodus MCP server without an API key for the first 70 calls. After that, you can add your InfraNodus API key to your Smithery profile and it will automatically connect to your account using the link above.

Usage Examples

Once installed, you can ask Claude to:

  • "Use InfraNodus to analyze this text and show me the main topics"
  • "Generate a knowledge graph from this document"
  • "Find content gaps in this article"
  • "Retrieve my existing graph called 'Research Notes' from InfraNodus"
  • "What are the structural gaps in this text?"
  • "Identify the most influential concepts in this content"

Development

Running in Development Mode

bash
npm run dev

Using the MCP Inspector

Test the server with the MCP Inspector:

bash
npm run build:inspect
npm run inspect

Building from Source

bash
npm run build

Watching for Changes

bash
npm run watch

API Documentation

generate_knowledge_graph

Analyzes text and generates a knowledge graph.

Parameters:

  • text (string, required): The text to analyze
  • includeStatements (boolean): Include original statements in response
  • modifyAnalyzedText (string): Text modification options ("none", "entities", "lemmatize")

analyze_existing_graph_by_name

Retrieves and analyzes an existing graph from your InfraNodus account.

Parameters:

  • graphName (string, required): Name of the existing graph
  • includeStatements (boolean): Include statements in response
  • includeGraphSummary (boolean): Include graph summary

analyze_text

Analyze a text, URL, or YouTube transcript. Extract and analyze a graph from text or URL; provide either text or url.

Parameters:

  • text (string, optional): Text to analyze. Provide either this or url.
  • url (string, optional): URL to fetch content from (e.g. webpage or YouTube transcript). Provide either this or text.
  • includeStatements (boolean): Include processed statements in response
  • includeGraph (boolean): Include full graph structure in response
  • addNodesAndEdges (boolean): Include nodes and edges in response
  • includeGraphSummary (boolean): Include AI-generated graph summary for RAG prompt augmentation
  • modifyAnalyzedText (string): Entity detection — "none", "detectEntities", or "extractEntitiesOnly"
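Because text and url are mutually exclusive, a client might guard a call like this (a hypothetical helper written for illustration, not part of the server's API; parameter names follow the list above):

```typescript
// Hypothetical client-side guard enforcing the "either text or url" rule of
// analyze_text. Illustrative only; not part of the InfraNodus MCP server.
interface AnalyzeTextArgs {
  text?: string;
  url?: string;
  includeGraphSummary?: boolean;
}

function validateAnalyzeTextArgs(args: AnalyzeTextArgs): void {
  const hasText = typeof args.text === "string" && args.text.length > 0;
  const hasUrl = typeof args.url === "string" && args.url.length > 0;
  // Reject both "neither provided" and "both provided".
  if (hasText === hasUrl) {
    throw new Error("Provide exactly one of `text` or `url`.");
  }
}

// Valid: exactly one source is given.
validateAnalyzeTextArgs({ url: "https://example.com", includeGraphSummary: true });
```

The server performs its own validation; a guard like this just fails fast on the client side.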

generate_content_gaps

Identifies content gaps and missing connections in text.

Parameters:

  • text (string, required): The text to analyze for gaps

Progress Notifications

For long-running operations (like SEO analysis), the MCP server supports real-time progress notifications that provide intermediary feedback to AI agents. This allows agents to:

  • Track the progress of multi-step operations
  • Display status messages to users
  • Understand what's happening during lengthy analyses

Implementation

The server implements MCP progress notifications using:

  1. ToolHandlerContext: All tool handlers can receive an optional context parameter containing the server instance and progress token
  2. ProgressReporter: A utility class that simplifies sending progress updates with percentages and messages
  3. Wrapped Handlers: Tool registration automatically injects the server context into handlers

Example Usage in Tools

typescript
import { ProgressReporter } from "../utils/progress.js";
import { ToolHandlerContext } from "../types/index.js";

// Inside a tool definition:
handler: async (params: ParamType, context: ToolHandlerContext = {}) => {
	const progress = new ProgressReporter(context);

	await progress.report(25, "Fetching data from API...");
	// Do work

	await progress.report(75, "Analyzing results...");
	// More work

	await progress.report(100, "Complete!");
	return results;
},

The generate_seo_report tool demonstrates this pattern with 6 major progress checkpoints that provide detailed status updates throughout the multi-step analysis process.
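For illustration, the reporting pattern can be approximated with a simplified stand-in (the real ProgressReporter in this repo sends MCP progress notifications through the server instance in the handler context; this sketch merely collects the updates so the flow can be seen end to end):

```typescript
// Simplified stand-in for the ProgressReporter pattern described above.
// The real class forwards (percent, message) pairs as MCP progress
// notifications; here we just record them for inspection.
interface ProgressUpdate {
  percent: number;
  message: string;
}

class FakeProgressReporter {
  updates: ProgressUpdate[] = [];
  async report(percent: number, message: string): Promise<void> {
    this.updates.push({ percent, message });
  }
}

// A mock multi-step tool handler reporting progress at checkpoints.
async function demo(): Promise<FakeProgressReporter> {
  const progress = new FakeProgressReporter();
  await progress.report(25, "Fetching data from API...");
  await progress.report(100, "Complete!");
  return progress;
}

demo().then((p) => console.log(p.updates.length)); // 2
```

Swapping the stand-in for the real reporter changes only where the updates go, not how handlers emit them.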

Troubleshooting

Server doesn't appear in Claude

  1. Verify the configuration file path is correct
  2. Check that the API key is valid
  3. Ensure Node.js is in your system PATH
  4. Restart Claude Desktop completely

API Key Issues

Build Errors

bash
# Clean install
rm -rf node_modules package-lock.json
npm install
npm run build

Resources

License

MIT

Support

For issues related to:
