ML++

The Revolutionary AI-First Systems Programming Language

100% C/C++ compatible • Compile-time tensor & shape verification • Native automatic differentiation • Zero-cost neural network abstractions • Reflection built-in

Get Started View Documentation
// Valid C++ is valid ML++ - 100% compatible!
#include <ml++/nn.h>

// Neural network as a C++ struct
struct MNISTNet {
    ml::nn::Linear<784, 128> fc1;
    ml::nn::Linear<128, 10> fc2;
    
    // Forward pass - looks like normal C++
    tensor<float, [?, 784]> forward(auto x) {
        return fc2(relu(fc1(x)));  // Shapes checked at compile-time!
    }
};

// Reflection - introspect at compile-time
constexpr auto params = ^MNISTNet.parameters();
static_assert(params.count() == 101770); // Compile-time check!

True Superset of C and C++

Every valid C and C++ program compiles in ML++ with identical semantics. We add powerful AI extensions without breaking compatibility.

ML++

Tensors • Autodiff • Reflection • Neural Networks

C++23

Templates • Concepts • Modules

C++17

Modern C++

C

Foundation

Every C Program

All valid C code compiles and runs identically in ML++

Every C++ Program

C++98 through C++23 - complete compatibility

🚀

Plus AI Extensions

Native tensors, gradients, and neural networks

Revolutionary Features

ML++ brings AI/ML development to compile-time while maintaining full C/C++ compatibility

🎯

Compile-Time Shape Checking

Tensor dimension errors caught at compile-time, not during training. No more runtime shape mismatches after hours of GPU time.

Native Automatic Differentiation

The ∇ operator computes gradients. Compiler generates optimal backpropagation code - no runtime graph construction overhead.

🔍

Compile-Time Reflection

Introspect types, iterate members, generate code - all at compile-time. Automatic serialization, Python bindings, and more.

🚀

Zero-Cost Neural Networks

Layers and models are language constructs, not classes. All overhead eliminated at compile-time for bare-metal performance.

🖥️

Hardware Agnostic

Write once, compile to CPU/GPU/TPU/distributed. Compiler generates optimal code for each target automatically.

Familiar C++ Syntax

Looks and feels like C++. If you know C++, you already know most of ML++: layers are structs, and training is an ordinary loop.

Language Comparisons

ML++ vs other languages for AI/ML development

ML++ vs C++

| Feature | C++ | ML++ |
|---|---|---|
| All C++ Features | ✅ | ✅ (100% compatible) |
| Tensors | ❌ (Eigen library) | ✅ Native type |
| Shape Checking | ❌ Runtime only | ✅ Compile-time |
| Automatic Differentiation | ❌ (PyTorch C++ lib) | ✅ Native (∇) |
| Neural Networks | ❌ Library only | ✅ Language construct |
| Reflection | ⚠️ Limited (RTTI) | ✅ Compile-time full |
| Performance | ⚡ Native | ⚡ Native (same) |

ML++ vs Python/PyTorch

| Feature | PyTorch | ML++ |
|---|---|---|
| Language Base | Python | C++ superset |
| Type Checking | Runtime | Compile-time |
| Shape Verification | Runtime errors | Compile-time errors |
| Gradients | Runtime graph | Compile-time code gen |
| Performance | Good (C++ backend) | Excellent (native) |
| Deployment | Python + dependencies | Single binary |
| Learning Curve | Easy | Moderate (C++ knowledge) |

ML++ vs Other AI Languages

| Feature | JAX | Rust+Candle | ML++ |
|---|---|---|---|
| Base Language | Python | Rust | C/C++ |
| Compilation | JIT (XLA) | AOT | AOT (native) |
| Type Safety | Dynamic | Strong | Strong + shapes |
| ML Integration | Transform API | Library | Native language |
| Ecosystem | Huge (Python) | Growing | Massive (C++) |
| Performance | Excellent | Excellent | Excellent |

Get Started

Build ML++ from source and start creating AI applications with compile-time guarantees

1

Build ML++ Compiler

Clone and build the ML++-enabled Clang compiler (a fork of LLVM with ML++ extensions).

git clone https://github.com/cognosphere/mlpp.git
cd mlpp && mkdir build && cd build
cmake -G Ninja \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_INSTALL_PREFIX=/usr/local/mlpp \
    -DLLVM_ENABLE_PROJECTS="clang;lld" \
    -DLLVM_TARGETS_TO_BUILD="X86;NVPTX" \
    ../llvm
ninja -j$(nproc)
sudo ninja install
2

Write Your First Model

Create a neural network using familiar C++ syntax with ML++ extensions.

// mnist.mlpp - Looks like C++!
#include <ml++/nn.h>

struct MNISTNet {
    ml::nn::Linear<784, 128> fc1;
    ml::nn::Linear<128, 10> fc2;
    
    auto forward(auto x) {
        return fc2(relu(fc1(x)));
    }
};

int main() {
    MNISTNet model;
    ml::optim::Adam opt(model.parameters(), 0.001);
    
    // Normal C++ training loop
    for (auto [imgs, labels] : train_data) {
        auto loss = ml::nn::cross_entropy(model(imgs), labels);
        loss.backward();
        opt.step();
    }
}
3

Compile and Run

Compile to native code with full shape verification and optimization.

# Compile with shape verification
mlpp++ -std=ml++23 mnist.mlpp -o mnist -O3

# Run on GPU
./mnist --device=gpu

# Shapes verified at compile time!
# No runtime dimension errors possible.

100%

C/C++ Compatible

0

Runtime Overhead

Native Autodiff

🔍

Compile-time Reflection

Our Partners

ML++ is developed and supported by leading organizations in software development, aerospace, and healthcare

Cognosphere Dynamics Limited Logo
Trajectory Inc Logo
Good Shepherd General Hospital Logo

These organizations collaborate to advance ML++ development, provide real-world testing environments, and ensure the language meets the demanding requirements of production systems across multiple industries.