Model Parallel

  • Overview
    • Model Parallelism
    • Philosophy
    • References
  • Model Parallel on ChainerMN
    • Step 1: Communicators
    • Step 2: Datasets and Iterators
    • Step 3: Define Communications
    • Note: Define-by-Run and Model Parallelism
    • Note: Delegate Variable and Pseudo Connect
  • Example 1: Simple MLP
  • Example 2: seq2seq
  • Example 3: Channel-wise Parallel Convolution
  • Example 4: Ensemble
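
The pages listed above walk through communicator creation, dataset and iterator setup, and the definition of inter-process communications. As a rough orientation before reading them, the following is a minimal sketch of splitting a two-layer MLP across two MPI processes using chainermn.create_communicator together with the point-to-point chainermn.functions.send / chainermn.functions.recv functions. The layer sizes, the 'naive' communicator choice, and the MLP0/MLP1 class names are illustrative assumptions, not code taken from the examples.

    import chainer
    import chainer.functions as F
    import chainer.links as L
    import chainermn


    class MLP0(chainer.Chain):
        """First half of the model, executed on rank 0."""

        def __init__(self, comm, n_units):
            super(MLP0, self).__init__()
            with self.init_scope():
                self.l1 = L.Linear(None, n_units)
            self.comm = comm

        def forward(self, x):
            h = F.relu(self.l1(x))
            # Send the activation to rank 1; the returned delegate variable
            # keeps the computational graph connected for backward.
            return chainermn.functions.send(h, self.comm, rank=1)


    class MLP1(chainer.Chain):
        """Second half of the model, executed on rank 1."""

        def __init__(self, comm, n_units, n_out):
            super(MLP1, self).__init__()
            with self.init_scope():
                self.l2 = L.Linear(n_units, n_out)
            self.comm = comm

        def forward(self, _):
            # Receive the activation computed on rank 0.
            h = chainermn.functions.recv(self.comm, rank=0)
            return self.l2(h)


    # Step 1: create a communicator ('naive' also works on CPU-only setups).
    comm = chainermn.create_communicator('naive')

    # Each MPI process builds only its own part of the model.
    if comm.rank == 0:
        model = MLP0(comm, n_units=100)
    else:
        model = MLP1(comm, n_units=100, n_out=10)

A script like this is launched under MPI (for example, mpiexec -n 2 python train.py). Example 1 above develops the same pattern into a full training loop, including the delegate variable and pseudo-connect mechanisms described in "Model Parallel on ChainerMN".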
