While leading tech companies race to build ever-larger models, researchers in Brazil, India and across Africa are using clever techniques to adapt big labs' LLMs, bringing AI to billions of users.
Burgos, P. The other AI revolution: how the Global South is building and repurposing language models that speak to billions. Nat. Comput. Sci. 5, 691–694 (2025). https://doi.org/10.1038/s43588-025-00865-y