
Launch of BrainChip Developer Hub Accelerates Event-Based AI Innovation on Akida™ Platform with Release of MetaTF 2.13

BrainChip announces new Developer Hub and MetaTF toolkit, enabling seamless development and deployment of machine learning models on its Akida™ platform.

LAGUNA HILLS, CA, UNITED STATES, June 19, 2025 /EINPresswire.com/ -- BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based brain-inspired AI, today announced the release of MetaTF 2.13 on its newly launched Developer Hub, a comprehensive portal designed to accelerate AI development on the Akida™ platform. The BrainChip Developer Hub serves as a centralized resource for developers building intelligent edge applications, providing access to tools, pre-trained models, technical documentation, and the company’s MetaTF toolkit. MetaTF 2.13 features seamless conversion, quantization, and deployment of machine learning models on Akida. It is compatible with leading frameworks, including Keras and ONNX, and supports Jupyter Notebooks, enabling rapid prototyping and optimization.

“We created the Developer Hub to streamline the experience for edge AI developers and give them the tools to move from concept to deployment quickly,” said Sean Hehir, CEO of BrainChip. “With our Akida processor, highly intuitive software stack, and world-class models, we’re delivering solutions that are both high-performing and energy-efficient.”

As part of this launch, BrainChip introduced two high-efficiency models optimized for edge performance. The eye-tracking model is ideal for smart glasses and wearable devices, delivering over 99% accuracy. Built on BrainChip’s proprietary Temporal Event-based Neural Networks (TENNs), it offers real-time gaze detection while dramatically reducing power consumption by processing only motion-relevant data.

The gesture recognition model is designed for embedded applications in consumer electronics, robotics, and IoT, and achieves 97% accuracy. By pairing Akida’s event-based processing with high-speed vision data from event-based cameras, it enables ultra-low-latency gesture interfaces without sacrificing precision.

These models demonstrate the power of Akida’s event-based architecture across a wide array of real-world applications including autonomous vehicles, industrial automation, AR/VR and spatial computing, smart environments and IoT, and security and surveillance.

BrainChip’s new Developer Hub and AI models underscore the company’s commitment to making edge AI more accessible and scalable. With Akida, developers can build responsive, privacy-aware applications that operate at ultra-low power—ideal for battery-constrained and latency-sensitive environments.

Developers can access the models and tools today by visiting: https://developer.brainchip.com

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)
BrainChip is the global leader in Edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain. By analyzing only the essential sensor inputs at the point of acquisition, Akida delivers data processing with unmatched efficiency, precision, and energy savings. Integrated into SoCs on any digital process technology, Akida Neural Processor IP has demonstrated significant advantages across today's workloads and networks. It provides a platform for developers to build, fine-tune, and run their models using standard AI tools such as TensorFlow and Keras.

BrainChip’s Temporal Event-based Neural Networks (TENNs) build on state space models (SSMs) by introducing a time-sensitive, event-driven processing framework that enhances efficiency and makes them ideal for real-time, streaming Edge applications. By enabling efficient computation with optimized models and hardware execution, BrainChip makes real-time streaming Edge AI universally deployable across industries such as aerospace, autonomous vehicles, robotics, mobile, consumer electronics, and wearable technology. BrainChip is leading the way toward a future where ultra-low power, on-chip AI near the sensor not only transforms products but also benefits the planet. Learn more at www.brainchip.com.

Follow BrainChip on Twitter: @BrainChip_inc
Follow BrainChip on LinkedIn: BrainChip LinkedIn

Madeline Coe
Bospar Communications
+1 224-433-9056
maddie@bospar.com

Powered by EIN Presswire


Legal Disclaimer:

EIN Presswire provides this news content "as is" without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the author above.
