# AscendNPU IR users

This project enables multiple operator programming frameworks to target the Ascend backend and provides Ascend-oriented compilation and optimization. Below are examples of languages and frameworks that have integrated with or use AscendNPU IR.

## Language ecosystem

| DSL | Description |
| --- | --- |
| [Triton-Ascend](https://gitcode.com/Ascend/triton-ascend) | Enables Triton developers to quickly develop Ascend operators and migrate the existing Triton ecosystem |
| [TileLang-Ascend (branch npuir)](https://github.com/tile-ai/tilelang-ascend/tree/npuir) | Tile-level programming for high-performance kernels, balancing productivity and low-level optimization |
| [DLCompiler](https://github.com/DeepLink-org/DLCompiler) | Deep learning compiler extending Triton, with cross-architecture DSL extensions and automatic optimization |
| [FlagTree](https://github.com/flagos-ai/flagtree) | Open-source AI compiler based on Triton with unified compilation across multiple backends |