Rakuten Group was chosen to power a Japanese generative AI accelerator programme intended to position the nation as a front-runner in the technology.

The company is set to commence R&D on an open AI foundation model integrating fresh approaches to boost memory capabilities, which Rakuten stated would effectively increase the information available for generating responses.

Rakuten is conducting the research as part of the third round of the Generative AI Accelerator Challenge, a project promoted by the country’s Ministry of Economy, Trade and Industry along with the New Energy and Industrial Technology Development Organisation.

The company explained the project “primarily provides support for computing resources necessary for generative AI development”, along with promoting knowledge sharing based on the “latest technology and developer community trends”.

Rakuten explained the first phase of the project commenced in February 2024, the second in October 2025 and the upcoming edition is scheduled to begin next month.

It noted it had been “actively developing and releasing Japanese language-optimised AI models to the open-source community since March 2024”. Its focus has been on “smaller, highly efficient models”.

Rakuten intends to use the expanded capabilities developed for the Japanese project to “overcome current limitations in generative AI memory”.

Enabling large language models to recall past interactions could pave the way for moving beyond current transformer architectures, which the company stated “struggle with extended context windows”.

Rakuten is also seeking further efficiency gains “through better training and inference algorithms” to open fresh potential for personalised AI services.