London-headquartered Oriole Networks, a startup that uses light to train LLMs a hundred times faster using only a fraction of the power, has raised $22 million in Series A funding. Plural (which recently backed VSParticle and Teton.ai) led the round, with participation from all existing investors, including UCL Technology Fund, XTX Ventures, Clean Growth Fund, and Dorilton Ventures.
This brings the company’s total funding to date to $35 million, including the £10 million in seed funding it secured earlier this year. The fresh investment will be used to accelerate growth, expand the team, and engage with high-volume suppliers. It will also help the company scale its “super-brain” solution. By 2025, it plans to have its early-stage products in the hands of customers, creating an ecosystem of photonic networking for AI.
In addition, Ian Hogarth, who led the investment on behalf of Plural, will be joining the board to help support the team through this crucial scaling phase.
Offsets AI energy consumption
Energy consumption due to AI is a major problem, with reports suggesting that ChatGPT alone uses enough energy to power the whole of Gibraltar for a year. A single ChatGPT query uses more than 25 times the energy of a Google search, according to research from Stanford University. At the same time, AI computing power is doubling every 100 days and is set to increase by more than a million times over the next five years. Oriole’s tech addresses this demand without sacrificing our planet.
Who is behind Oriole Networks?
The company was founded in 2023 out of University College London (UCL) by Professor George Zervas, Alessandro Ottino, and Joshua Benjamin. They are backed by a highly experienced commercial team bringing the technology and unique IP developed over two decades at UCL to a sector that is crying out for a solution to its sustainability problems, alongside a proven photonics team that includes a group from Lumentum.
Develops AI super-brain
Oriole Networks has developed a novel way of using light to connect thousands of AI chips. Once connected, the power of each graphics processing unit (GPU) is combined to form a super-brain. According to the company, this super-brain can train advanced large language models a hundred times faster, whilst consuming a fraction of the power, allowing algorithms to run with much lower latency.
This technology reduces the energy consumption of data centres, which are putting a huge strain on energy grids in both the US and Europe. If data centre demand triples by 2035, as expected, and developers struggle to install new wind and solar capacity, power sector emissions could be more than 56% higher than forecast, according to research by Rhodium Group.
James Regan, CEO of Oriole Networks, said: “This funding is yet another milestone for Oriole following a year of rapid pace and growth. This is a booming market desperate for solutions and our ambition is to create an ecosystem of photonic networking that can reshape this industry by solving today’s bottlenecks and enabling greater competition at the GPU layer. Building on decades of research, we’re paving the way for faster, more efficient, more sustainable AI.”
Ian Hogarth, Partner at Plural, said: “Applying 20 years of deep research and learning in photonics to create a better AI infrastructure demonstrates how much more innovation there is to come to help reap the benefits of this technology. The team behind Oriole Networks have proven experience in both company building and bringing deep science to commercialisation and are creating a fundamental shift in the design of next generation networked systems that will reduce latency and slash the energy impact of data centres on which we now rely.”