
Equilibrium Launches Tweekit-MCP Server with Multi-Modal Capabilities Across the Maitrix All AI Hub

The Tweekit-MCP Server brings universal file normalization and multi-modal media processing to every major AI platform simultaneously — OpenAI GPT, Google Gemini, Groq, Mistral, Anthropic Claude, and the full Maitrix All AI Hub — through a single hosted endpoint, secured by Google Cloud and powered by CPUcoin’s decentralized Proof of Processed Work network.

San Francisco, March 16, 2026 — Equilibrium today announced the general availability of the Tweekit-MCP Server — a production-ready Model Context Protocol integration that gives every AI platform connected to the Maitrix All AI Hub native access to Tweekit.io’s universal file normalization and multi-modal media processing capabilities.

The server is live now. Any MCP-compatible AI client can connect to it today.


What the Tweekit-MCP Server Does

The Model Context Protocol (MCP) is the open standard that lets AI assistants connect to external tools as native capabilities — not bolt-on plugins. The Tweekit-MCP Server exposes Tweekit’s MediaRich-powered engine as three core MCP tools available to any connected AI model:

  • convert: transform any file (resize, reformat, crop, change background, or extract a single page from a multi-page document)
  • convert_url: apply the same transformations directly to a file at any URL
  • doctype: identify any file’s supported input and output format options

Supported file types: 400+, including PSD, TIFF, HEIC, RAW camera formats, PDF, multi-page CAD drawings, Office documents, and broadcast video — everything that breaks standard AI ingestion pipelines.

Multi-modal output: image-to-image, image-to-video, and custom AI pipeline processing. Any file in, any format out, at any dimension, in seconds.
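Since MCP speaks JSON-RPC 2.0, a tool invocation against the server can be sketched as a plain request payload. The argument names below (url, format, width) are illustrative assumptions; the authoritative parameter schema comes from the server's own tool listing, which the doctype tool helps you navigate.

```python
import json

def mcp_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 'tools/call' request as used by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical arguments; the real parameter names are defined by the
# Tweekit-MCP server's published tool schema.
payload = mcp_tool_call("convert_url", {
    "url": "https://example.com/layout.psd",
    "format": "png",
    "width": 1024,
})
```

The same envelope works for convert and doctype; only the "name" and "arguments" fields change.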


Connecting to the Maitrix All AI Hub

Maitrix is the unified AI orchestration platform that brings the world’s leading AI models — OpenAI’s GPT series, Google Gemini, Groq, Mistral, Anthropic Claude, and more — into a single hub. Rather than maintaining separate integrations for each AI provider, Maitrix gives developers one connection point to route and orchestrate multi-model workflows.

With the Tweekit-MCP Server connected to Maitrix, every AI model in the hub gains:

  • Universal file ingestion — no more “file not supported” errors in any AI workflow
  • Format normalization on the fly — files are converted to the exact spec each model requires, automatically
  • Multi-modal readiness — images, video, documents, and audio all handled in a single pipeline

A file uploaded once through Maitrix is immediately available — correctly formatted — to GPT for vision analysis, Gemini’s multi-modal pipeline, Claude’s document processing, Groq’s fast inference, and any other model in the network. Simultaneously.
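To make "converted to the exact spec each model requires" concrete, here is a minimal sketch of per-model normalization. The format table and argument names below are invented for illustration; they are not Maitrix's actual routing rules or the Tweekit tools' real parameter names.

```python
# Hypothetical per-model ingestion requirements (illustrative only).
MODEL_SPECS = {
    "gpt-vision": {"format": "png", "max_dim": 2048},
    "gemini": {"format": "jpeg", "max_dim": 3072},
    "claude": {"format": "pdf", "max_dim": None},
}

def conversion_args(model: str, source_url: str) -> dict:
    """Arguments for a hypothetical convert_url call tailored to one model."""
    spec = MODEL_SPECS[model]
    args = {"url": source_url, "format": spec["format"]}
    if spec["max_dim"]:
        args["width"] = spec["max_dim"]  # cap the longest edge for this model
    return args

# One source file fans out to every model, each in its required format.
jobs = {m: conversion_args(m, "https://example.com/report.tiff") for m in MODEL_SPECS}
```

The point of the sketch: normalization happens once per model spec, not once per application, so adding a new model to the hub is a one-line table entry rather than a new ingestion pipeline.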


Connect in Minutes

Hosted endpoint (no install required):

https://mcp.tweekit.io/mcp

Point any MCP-compatible client at this URL with your API key and you’re live. Works with Claude Desktop, ChatGPT, Cursor, Continue IDE, and any HTTP MCP client.
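For clients you script yourself, the hosted endpoint is plain HTTPS. The sketch below builds a JSON-RPC request with only the standard library; the X-Api-Key header name is an assumption, so check Tweekit's documentation for the actual authentication scheme.

```python
import json
import urllib.request

MCP_URL = "https://mcp.tweekit.io/mcp"

def build_request(api_key: str, body: dict) -> urllib.request.Request:
    """POST a JSON-RPC body to the hosted endpoint ('X-Api-Key' is an assumed header name)."""
    return urllib.request.Request(
        MCP_URL,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json", "X-Api-Key": api_key},
        method="POST",
    )

if __name__ == "__main__":
    req = build_request("YOUR_API_KEY", {
        "jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {},
    })
    with urllib.request.urlopen(req) as resp:  # performs the actual network call
        print(resp.read().decode())
```

Dedicated MCP client libraries handle session initialization and streaming for you; this raw form is just to show there is nothing proprietary between you and the endpoint.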

Install via PyPI:

pip install tweekit-mcp

Requires Python 3.10+ and a Tweekit API key + secret. Get your API key at Tweekit.io.

Confirmed client compatibility:

  • Claude Desktop — full support
  • ChatGPT / OpenAI — HTTP MCP compatible
  • Groq — via the tweekit-mcp-docker-groq-e2b variant
  • Cursor, Continue IDE — partial support
  • Any MCP-compatible agent framework

Infrastructure: Google Cloud + CPUcoin PoPW

The Tweekit-MCP Server runs on Google Cloud Run, providing the security, compliance, and global availability that enterprise customers require. All file processing is handled within Google’s managed infrastructure.

The underlying compute workload is distributed across CPUcoin’s Proof of Processed Work (PoPW) miner network — a global network of verified compute nodes where miners are rewarded only for verified, completed processing work. This architecture delivers:

  • Up to 5x the processing throughput of AWS Lambda
  • Up to 90% lower cost than traditional cloud compute
  • Automatic horizontal scaling as demand grows
  • No single point of failure

“Every AI model has a file format problem. Every multi-modal pipeline hits the same wall with real-world user files,” said Sean Barger, CEO of Equilibrium. “Tweekit-MCP solves that problem once, across every major AI platform, through Maitrix — with 25 years of file format expertise behind it, deployed on Google Cloud, and scaled by CPUcoin’s decentralized network. This is the infrastructure layer the AI ecosystem has been missing.”


Free Tier Available Now

  • Free: 10,000 API calls for the first 30 days. Sign up at Tweekit.io.
  • Enterprise: custom volume. Contact for SLA and on-premises options.

The free tier includes full access to all MCP tools — convert, convert_url, and doctype — for the first 30 days.


About Equilibrium

Equilibrium’s mission is to make AI ingestion effortless. Since 2004, Equilibrium has helped global brands — including Walmart, Disney, Adidas, Warner Bros, and the U.S. Department of Energy — automate digital asset workflows with the MediaRich engine. Tweekit.io and the Tweekit-MCP Server bring that 25-year legacy of file format expertise to the AI era.

Media Contact: [email protected]

About CPUcoin

CPUcoin is decentralized infrastructure for the compute demands of tomorrow. The CPUcoin Computing Global Network (CGN) harnesses idle CPU and GPU resources worldwide through its Proof of Processed Work consensus mechanism — delivering scalable, affordable compute for AI, media, and enterprise applications at a fraction of traditional cloud costs. Learn more at cpucoin.io.
