Background

InertialAI-Alpha

A 325M-parameter foundation model that reads text and time-series as a single language. Designed for finetuning on your domain data.

325M
Parameters
138B
Text Tokens
384B
Time-Series Datapoints
2048
Token Context Length

Unified Intelligence

Your infrastructure generates text and time-series simultaneously. InertialAI Alpha is built to understand both natively.

Traditional approaches stitch together separate models, losing context at the seams. By projecting continuous sensor data into the same vector space as language tokens, InertialAI Alpha can reason about physical systems with the semantic depth of an LLM. It sees a latency spike and a database error not as two different data types, but as part of a single, unfolding narrative.
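The shared-vector-space idea can be sketched in a few lines. Everything below is illustrative: the patch length, embedding width, vocabulary, and random weights are made-up stand-ins, not Alpha's actual tokenizer or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 64          # shared embedding width (illustrative, not Alpha's real width)
patch_len = 16        # sensor samples grouped into one "time-series token"

# Hypothetical learned weights: a linear patch projector and a text embedding table.
W_patch = rng.normal(0, 0.02, size=(patch_len, d_model))
text_embed = rng.normal(0, 0.02, size=(1000, d_model))  # toy vocab of 1000 tokens

def embed_sensor(series):
    """Slice a 1-D sensor stream into patches and project each into token space."""
    n = len(series) // patch_len * patch_len
    patches = np.asarray(series, dtype=float)[:n].reshape(-1, patch_len)
    return patches @ W_patch                    # (num_patches, d_model)

def embed_text(token_ids):
    """Look up text tokens in the same d_model-wide embedding space."""
    return text_embed[np.asarray(token_ids)]    # (num_tokens, d_model)

# A latency trace and a log message become one interleaved sequence of vectors
# that a single transformer stack could attend over jointly.
latency = rng.normal(120, 5, size=64)           # e.g. ms samples from a monitor
log_ids = [17, 402, 6]                          # e.g. "database connection error"
sequence = np.concatenate([embed_sensor(latency), embed_text(log_ids)])
print(sequence.shape)                           # (7, 64): 4 patches + 3 tokens
```

Once both modalities live in one sequence, a latency spike and the adjacent error message are just neighboring positions for attention, which is the "single, unfolding narrative" described above.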

Designed for Adaptation

General purpose models fail on proprietary data. InertialAI Alpha is architected to be efficiently finetuned on your unique sensor arrays, text, and domain concepts.

Multi-modal Fusion

InertialAI Alpha ingests your time-series/sensor data and text simultaneously, treating both as first-class data types.

Sample Efficiency

Because it's pretrained on 150B tokens of physical signals, InertialAI Alpha learns your specific patterns with 100x fewer examples than training from scratch.

Data Privacy

Finetuning happens within your VPC. Your proprietary manufacturing or patient data never leaves your secure environment.

Competent in Both Regimes

InertialAI Alpha was trained on a mix of text and time-series data. It maintains competitive performance in both modalities simultaneously.

Language Understanding

Despite being trained on only 138B text tokens (vs. 6T-10T for comparable small models), InertialAI Alpha achieves strong performance on standard reasoning benchmarks.

Benchmark             Alpha (325M)   Gemma 3 (270M)   LFM-2 (350M)
Training Tokens       138B           6T               10T
HellaSwag (0-shot)    43.4%          40.1%            48.3%
PIQA (10-shot)        69.5%          67.6%            69.8%
BoolQ (10-shot)       56.5%          51.7%            57.2%
Winograd (0-shot)     67.4%          65.2%            60.8%
InertialAI Alpha exceeds Gemma 3 across these benchmarks and stays within a few points of LFM-2, despite training on over 40x less text data.

Probabilistic Forecasting

Time-series performance is evaluated with CRPS (Continuous Ranked Probability Score), which scores an entire predictive distribution against the observed value. Lower is better.
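For readers unfamiliar with the metric, CRPS for a sample-based forecast can be estimated with the standard ensemble identity CRPS = E|X - y| - 0.5 E|X - X'|. The snippet below is a generic illustration of the metric itself, not the evaluation harness used for the table; the forecast distributions are invented.

```python
import numpy as np

def crps_ensemble(samples, y):
    """Empirical CRPS of a sample-based forecast against observation y.

    Uses CRPS = E|X - y| - 0.5 * E|X - X'| over the forecast samples.
    Lower is better; a forecast concentrated exactly on y scores 0.
    """
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - y))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

rng = np.random.default_rng(0)
truth = 10.0
sharp = rng.normal(10.0, 0.5, size=500)   # accurate, confident forecast
vague = rng.normal(12.0, 3.0, size=500)   # biased, diffuse forecast
print(crps_ensemble(sharp, truth) < crps_ensemble(vague, truth))  # True
```

Note that the Naive baseline scores 1.00 in every row of the table below, which suggests the reported values are normalized relative to a naive forecast; the raw CRPS computed here is on the data's own scale.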

Domain               Alpha   TFM v1   TFM v2   Naive
Web/CloudOps         0.63    0.94     0.66     1.00
Energy (Grid Load)   0.89    0.78     0.67     1.00
Transport            0.64    0.59     0.50     1.00
Retail (Sales)       0.48    0.42     0.42     1.00
Nature (Weather)     0.72    0.41     0.35     1.00
These benchmarks probe representation quality. InertialAI Alpha is designed to be finetuned for downstream multi-modal applications.

Real-World Applications

See how InertialAI Alpha handles complex, multi-modal tasks that require reasoning across text and time-series. We compare performance against two popular deep learning architectures (LSTM, Transformer) and the top-performing machine learning method for each task.

Predicting Cardiac Events

Early detection of cardiac events in the ER requires analyzing two distinct data sources: continuous ECG waveforms from bedside monitors and unstructured clinical notes from triage. Standard approaches process these separately, missing critical correlations between physiological signals and patient history.

InertialAI Alpha jointly encodes the ECG time-series and clinical text into a shared representation. This allows the model to detect patterns such as arrhythmias that coincide with reported symptoms of faintness. By understanding both modalities together, it achieves a higher AUROC than single-modality baselines like Transformers and LSTMs.

[Diagram: Predicting Cardiac Events]

Model              AUROC (Higher is Better)   Performance Gap
InertialAI Alpha   0.934                      -
Transformer        0.891                      -4.6%
XGBoost            0.862                      -7.7%
LSTM               0.814                      -12.8%
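AUROC, the metric reported above, is the probability that a randomly chosen positive case is scored higher than a randomly chosen negative one. A minimal, self-contained way to compute it is the rank-sum (Mann-Whitney U) identity; the snippet below is a generic illustration with toy scores, not the cardiac-event evaluation itself, and it omits the tie correction a production implementation would need.

```python
import numpy as np

def auroc(scores, labels):
    """AUROC via the rank-sum identity: the probability that a random
    positive example outranks a random negative one. No tie handling."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)   # 1-based ranks by score
    n_pos, n_neg = labels.sum(), (~labels).sum()
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy check: perfectly separated scores give AUROC = 1.0.
print(auroc([0.1, 0.2, 0.8, 0.9], [0, 0, 1, 1]))  # 1.0
```

The "Performance Gap" column in the table is each baseline's relative drop from Alpha's AUROC, e.g. (0.891 - 0.934) / 0.934 ≈ -4.6% for the Transformer.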

Ready to build with InertialAI Alpha?

Request Access