Raghavan tells Poulomi Chatterjee that, as a full-stack platform, it should make every Indian’s life better. Excerpts: ...
Sarvam AI Co-founder Pratyush Kumar says the company has trained 30-billion-parameter and 105-billion-parameter models from ...
“The chip combines the low latency of SRAM-first designs with the long-context support of HBM,” MatX co-founder and Chief ...
Bengaluru-based AI startup Sarvam AI on February 18 announced the launch of two new large language models, a 30-billion-parameter model and a 105-billion-parameter model, both trained from scratch, ...
The new lineup includes 30-billion- and 105-billion-parameter models; a text-to-speech model; a speech-to-text model; and a vision model to parse documents.
In recent ground tests, Boeing engineers demonstrated that a large language model running on commercial off-the-shelf hardware could examine telemetry and report in natural language on the health of a ...
Pre-training on approximately 0.4 trillion tokens was conducted using cloud resources provided by Google Cloud Japan, with support from the Ministry of Economy, Trade and Industry’s GENIAC project.
Fledgling Indian artificial intelligence companies showcased homegrown technologies this week at a major summit in New Delhi, underscoring big dreams of becoming a global AI power. Another Indian ...
With GPT‑NL, TNO, together with SURF and the Netherlands Forensic Institute (NFI), is building an independent Dutch language ...