Showing 8 open source projects for "rope"

  • 1
    GLM-4

    GLM-4 series: Open Multilingual Multimodal Chat LMs

    GLM-4 is a family of open models from ZhipuAI that spans base, chat, and reasoning variants at both 32B and 9B scales, with long-context support and practical local-deployment options. The GLM-4-32B-0414 models are trained on ~15T tokens of high-quality data (including substantial synthetic reasoning data), then post-trained with preference alignment, rejection sampling, and reinforcement learning to improve instruction following, coding, function calling, and agent-style behaviors. The...
    Downloads: 3 This Week
    See Project
  • 2
    tiny-llm

    A hands-on course on LLM inference and serving on Apple Silicon

    tiny-llm is an educational open-source project designed to teach systems engineers how large language model inference and serving systems work by building them from scratch. The project is structured as a guided course that walks developers through implementing the core components required to run a modern language model, including attention mechanisms, token generation, and optimization techniques. Rather than relying on high-level machine learning frameworks, the codebase uses...
    Downloads: 0 This Week
    See Project
  • 3
    Qwen2.5-Omni

    Capable of understanding text, audio, images, and video

    Qwen2.5-Omni is an end-to-end multimodal flagship model in the Qwen series by Alibaba Cloud, designed to process multiple modalities (text, images, audio, video) and generate responses as both text and natural speech in real-time streaming. It uses a "Thinker-Talker" architecture and introduces innovations for aligning modalities over time (for example, synchronizing video and audio), robust speech generation, and low-VRAM quantized versions that make usage more accessible. It holds...
    Downloads: 1 This Week
    See Project
  • 4
    DeepSeek LLM

    DeepSeek LLM: Let there be answers

    ...The model is trained from scratch, reportedly on a vast multilingual, code, and reasoning dataset, and competes with other open or open-weight models. The architecture mirrors established decoder-only transformer families: pre-norm structure, rotary position embeddings (RoPE), grouped-query attention (GQA), and training that mixes languages and tasks. It ships in both "Base" (foundation) and "Chat" (instruction/conversation-tuned) variants.
    Downloads: 3 This Week
    See Project
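The rotary position embeddings (RoPE) named above can be sketched in a few lines. This is a minimal NumPy illustration of the split-half formulation, not any project's actual implementation; real codebases differ in channel pairing (interleaved vs. split-half) and cache the cos/sin tables:

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply rotary position embeddings to x of shape (seq_len, dim).

    Channel pair (i, i + dim/2) at position p is rotated by angle
    p * base**(-2i/dim), so relative positions are encoded directly
    in query/key dot products. Rotation preserves vector norms.
    """
    seq_len, dim = x.shape
    half = dim // 2
    inv_freq = base ** (-np.arange(half) / half)        # per-pair frequencies
    angles = np.outer(np.arange(seq_len), inv_freq)     # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]                   # split-half pairing
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)
```

Because every channel pair is only rotated, token norms are unchanged and position 0 (angle 0) is left as-is, which is an easy sanity check.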
  • 5
    QwQ-32B

    QwQ-32B is a reasoning-focused language model for complex tasks

    ...Recommended usage involves prompts starting with <think>\n, non-greedy sampling strategies, and standardized output formats for math and multiple-choice tasks. For long inputs, it supports YaRN (Yet another RoPE extensioN) context scaling.
    Downloads: 0 This Week
    See Project
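The "non-greedy sampling" recommendation above usually means temperature plus nucleus (top-p) sampling rather than argmax decoding, which reasoning models tend to need to avoid repetitive chains of thought. A minimal sketch with illustrative, not official, parameter values:

```python
import numpy as np

def top_p_sample(logits, p=0.95, temperature=0.6, rng=None):
    """Nucleus (top-p) sampling: keep the smallest set of tokens whose
    cumulative probability reaches p, then sample within that set.
    p and temperature here are illustrative defaults, not model-card values.
    """
    rng = rng or np.random.default_rng()
    scaled = logits / temperature
    probs = np.exp(scaled - np.max(scaled))             # stable softmax
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]                     # high to low
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1                # smallest nucleus covering p
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()              # renormalize nucleus
    return int(rng.choice(keep, p=kept))
```

With a sharply peaked distribution the nucleus collapses to one token, so the call degenerates to greedy decoding; with flat logits it samples across the whole vocabulary.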
  • 6
    Qwen2.5-14B-Instruct

    Powerful 14B LLM with strong instruction and long-text handling

    ...It demonstrates improved performance in coding, mathematics, and multilingual understanding across over 29 languages. Qwen2.5-14B-Instruct is built on a transformer backbone with RoPE, SwiGLU, RMSNorm, and attention QKV bias. It's resilient to varied prompt styles and is especially effective for JSON and tabular data generation. The model is instruction-tuned and supports chat templating, making it ideal for chatbot and assistant use cases.
    Downloads: 0 This Week
    See Project
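Of the backbone components listed above, RMSNorm is the simplest to illustrate: unlike LayerNorm it skips mean-centering and normalizes by the root-mean-square alone, which is cheaper and empirically just as stable. A minimal NumPy sketch, not the model's actual code:

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    """RMSNorm over the last axis: divide by the root-mean-square
    (no mean subtraction, no bias), then apply a learned per-channel
    weight. `weight` would be a trained parameter in a real model.
    """
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return x / rms * weight
```

With `weight` set to ones, the output of each row has RMS 1 by construction, which makes the operation easy to verify.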
  • 7
    Qwen3-Next

    Qwen3-Next: 80B instruct LLM with ultra-long context up to 1M tokens

    ...With 80B total parameters and 3B activated at a time, it leverages hybrid attention (Gated DeltaNet + Gated Attention) and a high-sparsity Mixture-of-Experts architecture to achieve exceptional efficiency. The model natively supports a context length of 262K tokens and can be extended up to 1 million tokens using RoPE scaling (YaRN), making it highly capable for processing large documents and extended conversations. Multi-Token Prediction (MTP) boosts both training and inference, while stability optimizations such as weight-decayed and zero-centered layernorm ensure robustness. Benchmarks show it performs comparably to larger models like Qwen3-235B on reasoning, coding, multilingual, and alignment tasks while requiring only a fraction of the training cost.
    Downloads: 0 This Week
    See Project
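RoPE scaling, as used above to stretch 262K native context toward 1M tokens, works by rescaling the rotation angles so positions beyond the training range map back into it. The sketch below shows plain linear position interpolation, the simplest variant; YaRN refines this by interpolating different frequency bands by different amounts, but the core idea is the same:

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0, scale=1.0):
    """RoPE rotation angles for the given positions.

    With scale > 1, positions are compressed (linear position
    interpolation), so a model trained on N tokens sees familiar
    angles over scale*N tokens. This is a simplified stand-in for
    YaRN, which scales each frequency band differently.
    """
    half = dim // 2
    inv_freq = base ** (-np.arange(half) / half)
    return np.outer(np.asarray(positions, dtype=float) / scale, inv_freq)
```

For example, with `scale=4.0` the angles at position 8192 match the unscaled angles at position 2048, so the model treats the longer sequence as if it fit inside its trained window.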
  • 8
    Devstral Small 2

    Lightweight 24B agentic coding model with vision and long context

    Devstral Small 2 is a compact agentic language model designed for software engineering workflows, excelling at tool usage, codebase exploration, and multi-file editing. With 24B parameters and FP8 instruct tuning, it delivers strong instruction following while remaining lightweight enough for local and on-device deployment. The model achieves competitive performance on SWE-bench, validating its effectiveness for real-world coding and automation tasks. It introduces vision capabilities,...
    Downloads: 0 This Week
    See Project