PanGu-Σ
Huawei

PanGu-α
Huawei

About

Scaling large language models has driven significant advances in natural language processing, understanding, and generation. This work uses Ascend 910 AI processors and the MindSpore framework to train PanGu-Σ, a language model with over a trillion parameters (1.085T). Building on the foundation laid by PanGu-α, the model extends the traditionally dense Transformer into a sparse one using Random Routed Experts (RRE). It was trained efficiently on a dataset of 329 billion tokens with Expert Computation and Storage Separation (ECSS), which yields a 6.3-fold increase in training throughput through heterogeneous computing. Experiments indicate that PanGu-Σ sets a new state of the art in zero-shot learning on a range of downstream Chinese NLP tasks.
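As a rough illustration of the Random Routed Experts idea mentioned above, here is a minimal sketch in plain Python/NumPy (not Huawei's implementation; the vocabulary size, domain count, expert count, and hidden size are made-up toy values). Each token id is mapped to a fixed, randomly chosen expert within a domain group, so no learned gating network or load-balancing loss is involved.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB_SIZE = 1000        # toy vocabulary (assumed)
NUM_DOMAINS = 4          # number of domain groups (assumed)
EXPERTS_PER_DOMAIN = 8   # experts per domain (assumed)
D_MODEL = 16             # hidden size (assumed)

# Fixed (random but frozen) token-id -> expert-id routing table, one row per domain.
routing_table = rng.integers(0, EXPERTS_PER_DOMAIN, size=(NUM_DOMAINS, VOCAB_SIZE))

# Toy expert feed-forward weights: one matrix per (domain, expert) pair.
experts = 0.02 * rng.standard_normal((NUM_DOMAINS, EXPERTS_PER_DOMAIN, D_MODEL, D_MODEL))

def rre_layer(hidden, token_ids, domain_id):
    """Route every token to its fixed expert and apply that expert's toy FFN."""
    out = np.empty_like(hidden)
    for i, (h, tok) in enumerate(zip(hidden, token_ids)):
        expert_id = routing_table[domain_id, tok]   # deterministic lookup, no gating network
        out[i] = np.maximum(h @ experts[domain_id, expert_id], 0.0)  # ReLU feed-forward
    return out

# Usage: a batch of 5 tokens from domain 0.
token_ids = np.array([3, 42, 7, 42, 999])
hidden = rng.standard_normal((5, D_MODEL))
print(rre_layer(hidden, token_ids, domain_id=0).shape)   # (5, 16)
```

Because the routing is fixed per domain group, the experts of a single domain can in principle be pulled out as a smaller stand-alone sub-model, which is part of the appeal of fixed random routing over learned gating.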

About

PanGu-α is developed under the MindSpore framework and trained on a cluster of 2048 Ascend 910 AI processors. The training parallelism strategy is implemented with MindSpore Auto-parallel, which composes five parallelism dimensions (data parallelism, op-level model parallelism, pipeline model parallelism, optimizer model parallelism, and rematerialization) to scale the training task efficiently to 2048 processors. To enhance the generalization ability of PanGu-α, we collect 1.1 TB of high-quality Chinese data from a wide range of domains to pretrain the model. We empirically test the generation ability of PanGu-α in various scenarios, including text summarization, question answering, and dialogue generation. Moreover, we investigate the effect of model scale on few-shot performance across a broad range of Chinese NLP tasks. The experimental results demonstrate the superior capabilities of PanGu-α in performing various tasks under few-shot and zero-shot settings.
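To make one of these parallelism dimensions concrete, the sketch below illustrates op-level (tensor) model parallelism with plain NumPy rather than MindSpore: the weight of a single matmul is sharded column-wise across a handful of simulated devices, each device computes a partial output from the full activations, and the pieces are concatenated (standing in for an all-gather) to recover the full result. Device count and tensor shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

BATCH, D_IN, D_OUT = 4, 8, 12   # toy shapes (assumed)
NUM_DEVICES = 4                 # simulated devices (assumed)

x = rng.standard_normal((BATCH, D_IN))    # activations, replicated on every device
w = rng.standard_normal((D_IN, D_OUT))    # the full weight of one linear op

# Column-wise shard: device k holds only its slice of the weight.
shards = np.split(w, NUM_DEVICES, axis=1)

# Each device multiplies the full activation by its own weight shard...
partial_outputs = [x @ w_k for w_k in shards]

# ...and concatenating along the feature axis plays the role of the all-gather
# that a real framework would insert automatically.
y_parallel = np.concatenate(partial_outputs, axis=1)

assert np.allclose(y_parallel, x @ w)     # identical to the single-device matmul
```

In an actual PanGu-α run, this kind of sharding and the required communication are chosen by MindSpore Auto-parallel and combined with data, pipeline, and optimizer parallelism plus rematerialization to fit training across the 2048 processors.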

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook

Audience

AI developers

Audience

AI developers interested in a powerful large language model

Support

Phone Support
24/7 Live Support
Online

Support

Phone Support
24/7 Live Support
Online

API

Offers API

API

Offers API


Pricing

No information available.
Free Version
Free Trial

Pricing

No information available.
Free Version
Free Trial

Reviews/Ratings

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Reviews/Ratings

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet.

Training

Documentation
Webinars
Live Online
In Person

Training

Documentation
Webinars
Live Online
In Person

Company Information

Huawei
Founded: 1987
China
huawei.com

Company Information

Huawei
Founded: 1987
China
arxiv.org/abs/2104.12369

Alternatives

LTM-1
Magic AI

Alternatives

PanGu-Σ
Huawei

PanGu-α
Huawei

OPT
Meta

DeepSeek-V2
DeepSeek

GPT-NeoX
EleutherAI

VideoPoet
Google

Integrations

PanGu Chat

Integrations

PanGu Chat