SciPhi

Streamline Retrieval-Augmented Generation system development with flexible infrastructure management, scalable compute resources, and cutting-edge techniques for AI innovation.
Categories: Artificial Intelligence Development, Document Management, Conversational AI Solutions

SciPhi is designed to make it easier to build, deploy and scale Retrieval-Augmented Generation (RAG) systems. With flexible infrastructure management, users can focus on AI innovation and customization without worrying about the underlying infrastructure.

To achieve this goal, SciPhi offers the following features:

  • Flexible Document Ingestion: Supports a variety of file formats, including CSV, DOCX, HTML, JSON, PDF, and plain text (see the ingestion sketch after this list).
  • Robust Document Management: Update or delete vectors at the user and document level.
  • Configurable Provider: Connects to third-party data retrieval sources like Serper and Exa.
  • Dynamic Scalability: Autoscales compute resources based on user demand.
  • SOTA Techniques: Deploys the latest techniques, such as HyDE and RAG-Fusion, with a single click.
  • Intelligent Assistants: Transforms conversational AI systems into informed assistants by augmenting responses with retrieved information.
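
A minimal sketch of what an ingestion-and-query flow against a SciPhi pipeline might look like. The base URL, request paths, and request fields below are illustrative assumptions for the sake of the example, not the documented SciPhi API; consult the official documentation for the actual interface.

    # Hypothetical sketch -- the endpoint, paths, and request fields below are
    # assumptions for illustration, not the documented SciPhi API.
    import requests

    API_BASE = "https://api.sciphi.example/v1"   # placeholder base URL (assumption)
    HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

    # 1) Ingest a document into a pipeline (CSV, DOCX, HTML, JSON, PDF, or text).
    with open("report.pdf", "rb") as f:
        requests.post(
            f"{API_BASE}/pipelines/demo/documents",
            headers=HEADERS,
            files={"file": f},
        )

    # 2) Query the pipeline, asking for a retrieval technique such as HyDE.
    answer = requests.post(
        f"{API_BASE}/pipelines/demo/query",
        headers=HEADERS,
        json={"query": "Summarize the key findings.", "technique": "hyde"},
    )
    print(answer.json())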

SciPhi has pricing tiers to fit different project needs:

  • Free: Suited to small projects; includes a single developer, up to 10 pipelines, 10,000 embeddings per pipeline, and 100,000 requests per month.
  • Startup: For startups and small teams at $499 per month; includes unlimited pipelines, a team workspace, up to 1 million embeddings, and 1 million requests per month.
  • Enterprise: Custom pricing for larger organizations; includes everything in the Startup plan plus additional features such as prioritized onboarding, on-prem deployment, and managed migration.

SciPhi aims to make RAG development straightforward through easy configuration, full customization, and fast deployment. It integrates directly with GitHub for version control and deployment, and users can deploy to the managed cloud or run the system on their own infrastructure using Docker. Detailed documentation covers setup and advanced usage, and the project is open source and backed by a large community of LLM application developers.
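
For a self-hosted, Docker-based deployment, client code like the sketch above would presumably target a local endpoint rather than the hosted cloud. A minimal configuration sketch follows, with the caveat that the environment variable name and default port are illustrative assumptions rather than documented values:

    import os
    import requests

    # Assumption: a self-hosted deployment exposes a local HTTP endpoint; the
    # variable name and default port here are illustrative, not documented values.
    API_BASE = os.environ.get("SCIPHI_BASE_URL", "http://localhost:8000/v1")

    # Reuse the same ingestion and query calls as above, just against the local server.
    print(f"Using SciPhi endpoint: {API_BASE}")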

Published on June 9, 2024
